Mar 19 18:56:28 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 18:56:28 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:28 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 18:56:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:28 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:29 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:29 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 18:56:30 crc kubenswrapper[5033]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.346618 5033 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353484 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353511 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353521 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353529 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353537 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353549 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353560    5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353569    5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353578    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353585    5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353593    5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353602    5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353610    5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353619    5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353627    5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353635    5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353644    5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353651    5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353659    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353666    5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353674    5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353682    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353690    5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353697    5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353705    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353712    5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353720    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353727    5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353735    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353742    5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353751    5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353759    5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353767    5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353776    5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353783    5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353791    5033 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353798    5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353806    5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353814    5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353824    5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353834    5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353843    5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353851    5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353863    5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353872    5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353880    5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353888    5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353898    5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353907    5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353915    5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353922    5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353930    5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353937    5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353945    5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353952    5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353960    5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353969    5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353978    5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353985    5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.353993    5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354000    5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354008    5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354015    5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354025    5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354033    5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354040    5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354047    5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354055    5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354063    5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354072    5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.354079    5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354216    5033 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354232    5033 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354246    5033 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354257    5033 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354273    5033 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354282    5033 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354293    5033 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354307    5033 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354317    5033 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354326    5033 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354336    5033 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354345    5033 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354354    5033 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354364    5033 flags.go:64] FLAG: --cgroup-root=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354372    5033 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354381    5033 flags.go:64] FLAG: --client-ca-file=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354390    5033 flags.go:64] FLAG: --cloud-config=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354400    5033 flags.go:64] FLAG: --cloud-provider=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354409    5033 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354420    5033 flags.go:64] FLAG: --cluster-domain=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354429    5033 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354438    5033 flags.go:64] FLAG: --config-dir=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354471    5033 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354482    5033 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354493    5033 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354501    5033 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354510    5033 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354520    5033 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354529    5033 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354538    5033 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354546    5033 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354556    5033 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354564    5033 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354575    5033 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354587    5033 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354597    5033 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354607    5033 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354616    5033 flags.go:64] FLAG: --enable-server="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354624    5033 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354635    5033 flags.go:64] FLAG: --event-burst="100"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354645    5033 flags.go:64] FLAG: --event-qps="50"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354653    5033 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354662    5033 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354670    5033 flags.go:64] FLAG: --eviction-hard=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354681    5033 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354690    5033 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354698    5033 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354707    5033 flags.go:64] FLAG: --eviction-soft=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354717    5033 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354727    5033 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354736    5033 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354745    5033 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354754    5033 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354763    5033 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354772    5033 flags.go:64] FLAG: --feature-gates=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354782    5033 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354791    5033 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354801    5033 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354810    5033 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354818    5033 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354828    5033 flags.go:64] FLAG: --help="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354837    5033 flags.go:64] FLAG: --hostname-override=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354845    5033 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354854    5033 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354863    5033 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354872    5033 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354881    5033 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354889    5033 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354898    5033 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354907    5033 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354917    5033 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354926    5033 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354935    5033 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354944    5033 flags.go:64] FLAG: --kube-reserved=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354953    5033 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354961    5033 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354971    5033 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354980    5033 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354988    5033 flags.go:64] FLAG: --lock-file=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.354997    5033 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355008    5033 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355018    5033 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355030    5033 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355039    5033 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355048    5033 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355057    5033 flags.go:64] FLAG: --logging-format="text"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355066    5033 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355075    5033 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355084    5033 flags.go:64] FLAG: --manifest-url=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355093    5033 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355104    5033 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355114    5033 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355124    5033 flags.go:64] FLAG: --max-pods="110"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355133    5033 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355142    5033 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355151    5033 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355160    5033 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355169    5033 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355177    5033 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355186    5033 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355204    5033 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355213    5033 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355222    5033 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355231    5033 flags.go:64] FLAG: --pod-cidr=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355240    5033 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355252    5033 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355263    5033 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355272    5033 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355281    5033 flags.go:64] FLAG: --port="10250"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355289    5033 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355298    5033 flags.go:64] FLAG: --provider-id=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355307    5033 flags.go:64] FLAG: --qos-reserved=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355316    5033 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355325    5033 flags.go:64] FLAG: --register-node="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355334    5033 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355342    5033 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355356    5033 flags.go:64] FLAG: --registry-burst="10"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355365    5033 flags.go:64] FLAG: --registry-qps="5"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355373    5033 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355382    5033 flags.go:64] FLAG: --reserved-memory=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355392    5033 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355407    5033 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355416    5033 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355425    5033 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355434    5033 flags.go:64] FLAG: --runonce="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355443    5033 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355476    5033 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355486    5033 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355495    5033 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355503    5033 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355512    5033 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355557    5033 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355567    5033 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355577    5033 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355586    5033 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355595    5033 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355604    5033 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355613    5033 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355622    5033 flags.go:64] FLAG: --system-cgroups=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355631    5033 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355645    5033 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355653    5033 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355662    5033 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355674    5033 flags.go:64] FLAG: --tls-min-version=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355683    5033 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355692    5033 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355701    5033 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355710    5033 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355719    5033 flags.go:64] FLAG: --v="2"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355729    5033 flags.go:64] FLAG: --version="false"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355740    5033 flags.go:64] FLAG: --vmodule=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355751    5033 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.355760    5033 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.355958    5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.355968    5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.355977    5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.355985    5033 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.355994    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356002    5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356010    5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356018    5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356025    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356033    5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356041    5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356049    5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356057    5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356064    5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356072    5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356079    5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356087    5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356094    5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356102    5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356110    5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356118    5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356126    5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356134    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356141    5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356149    5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356158    5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356166    5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356174    5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356181    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356188    5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356196    5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356204    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356212    5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356220    5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356227    5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356235    5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356243    5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356251    5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356258    5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356266    5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356273    5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356281    5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356289    5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356297    5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356305    5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356312    5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356320    5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356328    5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356338    5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356347    5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356355    5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356363    5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356371    5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356378    5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356386    5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356393    5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356402    5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356409    5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356418    5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356426    5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356436    5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356472    5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356483    5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356492    5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356501    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356509    5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356517    5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356525    5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356533    5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356543    5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.356553    5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.357180    5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.367664    5033 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.367716    5033 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367853    5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367875    5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367886    5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367895    5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367905    5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367915    5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367923    5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367932    5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367940    5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367949    5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367958    5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367969    5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367978    5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367988    5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.367998    5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368010    5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368024    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368034    5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368044    5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368054    5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368064    5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368074    5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368088    5033
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368103 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368114 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368125 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368136 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368146 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368155 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368165 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368175 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368188 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368199 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368209 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368219 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368229 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 
18:56:30.368240 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368250 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368261 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368271 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368281 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368289 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368298 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368307 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368315 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368323 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368334 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368345 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368354 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368364 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368373 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368381 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368394 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368403 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368411 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368419 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368427 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368435 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368444 5033 feature_gate.go:330] unrecognized feature gate: Example Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368485 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368494 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368502 5033 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368510 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368521 5033 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368530 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368538 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368547 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368555 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368564 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368575 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368585 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.368600 5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368835 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368888 5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368905 5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368920 5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368933 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368944 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368956 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368966 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368975 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368984 5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.368997 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369008 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369018 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369033 5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369045 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369057 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369070 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369080 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369089 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369099 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369110 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369120 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369134 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369149 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369161 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369172 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369181 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369189 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369197 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369205 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369213 5033 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369223 5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369232 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369241 5033 feature_gate.go:330] unrecognized feature gate: Example Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369250 5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369258 5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369266 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369274 5033 feature_gate.go:330] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369283 5033 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369291 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369299 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369308 5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369317 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369325 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369333 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369341 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369349 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369358 5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369366 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369374 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369383 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369391 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 
19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369400 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369409 5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369417 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369425 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369433 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369442 5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369488 5033 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369497 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369505 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369513 5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369524 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369536 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369548 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369559 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369568 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369577 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369586 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369594 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.369602 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.369616 5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.370764 5033 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.376161 5033 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig 
is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.381149 5033 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.381317 5033 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.383855 5033 server.go:997] "Starting client certificate rotation" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.383909 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.385599 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.410046 5033 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.413057 5033 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.413796 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.431981 5033 log.go:25] "Validated CRI v1 runtime API" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.473055 5033 log.go:25] "Validated CRI v1 image API" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.478528 5033 
server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.485895 5033 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-18-51-54-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.485949 5033 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.515642 5033 manager.go:217] Machine: {Timestamp:2026-03-19 18:56:30.51230525 +0000 UTC m=+0.617335179 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4c45382c-f0a8-4377-81e5-4e3ff4799b14 BootID:0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:55:7d:3e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:55:7d:3e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ed:ea:1a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b1:8d:5a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:19:a9:39 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d5:fb:99 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:15:fd:32:c5:42 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1a:e4:84:4e:e4:1c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] 
CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.516032 5033 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.516219 5033 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.518689 5033 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.518911 5033 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.518948 5033 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.519149 5033 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.519159 5033 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.519550 5033 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.520792 5033 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.521018 5033 state_mem.go:36] "Initialized new in-memory state store" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.521103 5033 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.526138 5033 kubelet.go:418] "Attempting to sync node with API server" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.526160 5033 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.526200 5033 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.526215 5033 kubelet.go:324] "Adding apiserver pod source" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.526277 5033 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.530878 5033 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.531966 5033 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.533054 5033 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.533736 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.533745 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.533849 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.533874 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534882 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534906 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534914 5033 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534922 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534934 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534943 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534952 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534963 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534975 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.534983 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.535008 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.535016 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.538485 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.538874 5033 server.go:1280] "Started kubelet" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.539896 5033 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.539901 5033 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.540725 5033 server.go:236] "Starting to serve 
the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 18:56:30 crc systemd[1]: Started Kubernetes Kubelet. Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.541239 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.542915 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.542974 5033 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.543145 5033 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.543183 5033 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.543181 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.543279 5033 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.543932 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.544017 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" 
logger="UnhandledError" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.544332 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.544924 5033 factory.go:55] Registering systemd factory Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.544945 5033 factory.go:221] Registration of the systemd container factory successfully Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551564 5033 factory.go:153] Registering CRI-O factory Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551609 5033 factory.go:221] Registration of the crio container factory successfully Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551716 5033 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551756 5033 factory.go:103] Registering Raw factory Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551860 5033 manager.go:1196] Started watching for new ooms in manager Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.551564 5033 server.go:460] "Adding debug handlers to kubelet server" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.550858 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.556009 5033 manager.go:319] Starting recovery of all containers Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560270 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560380 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560403 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560424 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560440 5033 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560489 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560510 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560528 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560547 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560563 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560580 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560598 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560617 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560644 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560669 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560686 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560704 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560722 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560739 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560758 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560775 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560792 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560809 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" 
seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560825 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560844 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560861 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560882 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560901 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560919 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560939 
5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.560955 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561015 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561117 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561144 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561162 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561183 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561204 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561223 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561240 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561257 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561274 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561291 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561312 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561328 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561346 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561368 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561384 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561401 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" 
seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561418 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561434 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561494 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561513 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561539 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561558 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 19 18:56:30 crc 
kubenswrapper[5033]: I0319 18:56:30.561577 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561595 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561613 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561630 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561648 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561667 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561755 5033 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561779 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561796 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561814 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561841 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561859 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561879 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561897 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561914 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561933 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561954 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561971 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.561989 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562006 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562023 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562043 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562059 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562079 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562095 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562112 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562127 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562145 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562171 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562187 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562206 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 
18:56:30.562222 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562241 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562259 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562276 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562291 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562311 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562328 5033 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562348 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562379 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562398 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562417 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562439 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562488 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562508 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562526 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562546 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562565 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562582 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562599 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562624 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562644 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562665 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562683 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562700 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562720 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562738 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562758 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562777 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562795 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562813 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562829 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562845 5033 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562861 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562878 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562895 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562912 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562928 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562944 5033 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562962 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562977 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.562999 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563018 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563035 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563054 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563071 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563089 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563107 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563123 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563143 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563163 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563181 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563198 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563218 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563236 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563253 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563270 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563293 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563309 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563327 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563344 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563360 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563377 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563394 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563648 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563667 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563689 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563707 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563723 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563742 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563759 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563780 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.563797 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569018 5033 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569082 5033 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569103 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569124 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569145 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569159 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569173 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569185 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569202 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569214 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569228 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569256 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569269 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569282 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569295 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.569307 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570012 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570037 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570050 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570271 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570287 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570954 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.570986 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.571001 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572249 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572287 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572309 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572325 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572339 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572352 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572368 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572389 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572408 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572425 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572443 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572484 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572506 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572536 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572554 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572571 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572588 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572607 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572628 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572641 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572655 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572673 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572699 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572721 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572742 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572759 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572771 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572789 5033 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.572799 5033 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.576131 5033 manager.go:324] Recovery completed
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.588253 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.590401 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.590543 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.590616 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.593121 5033 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.593138 5033 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.593158 5033 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.611770 5033 policy_none.go:49] "None policy: Start"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.613192 5033 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.613234 5033 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.615619 5033 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.619115 5033 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.619159 5033 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.619189 5033 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.619299 5033 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 19 18:56:30 crc kubenswrapper[5033]: W0319 18:56:30.623481 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.623554 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.644236 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.653403 5033 manager.go:334] "Starting Device Plugin manager"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.653493 5033 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.653507 5033 server.go:79] "Starting device plugin registration server"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.654997 5033 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.655124 5033 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.655302 5033 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.655515 5033 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.655532 5033 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.670117 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.721068 5033 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.721237 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.722680 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.722721 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.722732 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.722899 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.723115 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.723189 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.723859 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.723921 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.723940 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724117 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724142 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724158 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724142 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724278 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724315 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724941 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.724984 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725137 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725287 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725326 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725972 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.725983 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726107 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726133 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726145 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726602 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726651 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726671 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.726888 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.727169 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.727260 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.727945 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.727983 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.728004 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.728162 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.728187 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729001 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729038 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729049 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729219 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.729262 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.746902 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.756012 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.757623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.757690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.757704 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.757736 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.758392 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775151 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775250 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775310 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775341 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775395 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775558 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775580 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775671 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775815 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.775923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877668 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877886 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877871 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.877924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878083 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878137 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878157 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878196 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878177 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878256 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878264 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878272 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878307 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID:
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878331 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878552 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878641 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878660 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878685 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878716 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878728 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878764 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878803 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.878778 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.959239 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.961174 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.961241 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.961267 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:30 crc kubenswrapper[5033]: I0319 18:56:30.961306 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:30 crc kubenswrapper[5033]: E0319 18:56:30.961915 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.053412 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.074283 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.090079 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.092748 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e4e1244c494e0bba3a966bb6fd78eb4288c21b51660deec659c95c0ca988b2ec WatchSource:0}: Error finding container e4e1244c494e0bba3a966bb6fd78eb4288c21b51660deec659c95c0ca988b2ec: Status 404 returned error can't find the container with id e4e1244c494e0bba3a966bb6fd78eb4288c21b51660deec659c95c0ca988b2ec Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.101524 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.107997 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.111438 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9c95dddb5e4effa108222b34143c67fd9b3715691dea8941413dfe32cf3d9f7f WatchSource:0}: Error finding container 9c95dddb5e4effa108222b34143c67fd9b3715691dea8941413dfe32cf3d9f7f: Status 404 returned error can't find the container with id 9c95dddb5e4effa108222b34143c67fd9b3715691dea8941413dfe32cf3d9f7f Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.116600 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-06fe23df72f68cd6eba0d9272337c3d95717d45c0dffa2453714bddb4c64b98e WatchSource:0}: Error finding container 06fe23df72f68cd6eba0d9272337c3d95717d45c0dffa2453714bddb4c64b98e: Status 404 returned 
error can't find the container with id 06fe23df72f68cd6eba0d9272337c3d95717d45c0dffa2453714bddb4c64b98e Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.129783 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-70ab67e4291456f898196e60ed0c53ef56e1ac0577108f234950cec64371b58f WatchSource:0}: Error finding container 70ab67e4291456f898196e60ed0c53ef56e1ac0577108f234950cec64371b58f: Status 404 returned error can't find the container with id 70ab67e4291456f898196e60ed0c53ef56e1ac0577108f234950cec64371b58f Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.136239 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8f8830b230c6fa2a90072dd6e36dc2a6275f38726491db19b7f127fa24746468 WatchSource:0}: Error finding container 8f8830b230c6fa2a90072dd6e36dc2a6275f38726491db19b7f127fa24746468: Status 404 returned error can't find the container with id 8f8830b230c6fa2a90072dd6e36dc2a6275f38726491db19b7f127fa24746468 Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.148033 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.336783 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.336904 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.362581 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.364200 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.364268 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.364287 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.364324 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.365070 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.542360 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.624913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e4e1244c494e0bba3a966bb6fd78eb4288c21b51660deec659c95c0ca988b2ec"} Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.626126 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f8830b230c6fa2a90072dd6e36dc2a6275f38726491db19b7f127fa24746468"} Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.627216 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"70ab67e4291456f898196e60ed0c53ef56e1ac0577108f234950cec64371b58f"} Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.627975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c95dddb5e4effa108222b34143c67fd9b3715691dea8941413dfe32cf3d9f7f"} Mar 19 18:56:31 crc kubenswrapper[5033]: I0319 18:56:31.628748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06fe23df72f68cd6eba0d9272337c3d95717d45c0dffa2453714bddb4c64b98e"} Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.649384 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.649505 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.838384 
5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.838526 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:31 crc kubenswrapper[5033]: W0319 18:56:31.871761 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.871880 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:31 crc kubenswrapper[5033]: E0319 18:56:31.949496 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.166148 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.168325 5033 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.168373 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.168385 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.168414 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:32 crc kubenswrapper[5033]: E0319 18:56:32.168986 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.542697 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.564931 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:32 crc kubenswrapper[5033]: E0319 18:56:32.566496 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.635504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f7f5b49dae611f5407073e82856b8334edbba42df7471c3449cc60f29063961"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.635586 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2264fdc35c4d9916ae573ecd680ea8402f15034bb330c5f81a9fe9b6dba5e844"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.635616 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.635641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"75875553f125bbbd24248852d61935696cf55f7aede9e3a7b10bfbceb78e2d58"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.635648 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.637224 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.637278 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.637304 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.639126 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c" exitCode=0 Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.639329 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.639416 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.640791 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.640856 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.640889 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.642574 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095" exitCode=0 Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.642671 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.642804 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.643584 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: 
I0319 18:56:32.645144 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.645179 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.645204 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.645207 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.645294 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.645948 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.646108 5033 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="079848c755462140bf93960a91cf2406ccb440d2b436edb2565acf510d4645d8" exitCode=0 Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.646187 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.646225 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"079848c755462140bf93960a91cf2406ccb440d2b436edb2565acf510d4645d8"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.647126 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.647179 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.647199 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.649112 5033 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="936c886a0a2f459bacf4d4b58fb8712de4487363563f54ad57639cde57470237" exitCode=0 Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.649202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"936c886a0a2f459bacf4d4b58fb8712de4487363563f54ad57639cde57470237"} Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.649433 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.652467 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.652527 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.652552 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[5033]: I0319 18:56:32.663201 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:33 crc kubenswrapper[5033]: W0319 18:56:33.292204 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial 
tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:33 crc kubenswrapper[5033]: E0319 18:56:33.292303 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:33 crc kubenswrapper[5033]: W0319 18:56:33.398482 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:33 crc kubenswrapper[5033]: E0319 18:56:33.398608 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.542280 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 19 18:56:33 crc kubenswrapper[5033]: E0319 18:56:33.551082 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.655575 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e591265d7c66e07b70cc05b9dbe748dee4dfbb9162a8b58888f5ff69b8814a5f"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.655634 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.656836 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.656900 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.656919 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.665369 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f17dfe501e85d9a317797f154e9e83b979d23ddbeb7f48ecd84a08b70c7d4eb"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.665440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dcfaa3c6eaa1194cfb579bc459892ec225356ca72de8858bd834d5dcd43021e4"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.665502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"745ba9f20a528a654296ef3da026885b2abd8d465561231b639d5da211bf298f"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.665683 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc 
kubenswrapper[5033]: I0319 18:56:33.668311 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.668365 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.668386 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3a05f948f36f9e83d9469f2b81ae2c94b19a28ca63466f44427eccae0847c75"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674834 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674857 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.674920 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.675839 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.675874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.675886 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.676673 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f" exitCode=0 Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.676783 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f"} Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.676870 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.676906 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678311 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678372 5033 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678397 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678315 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678478 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.678490 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.769425 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.771029 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.771076 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.771093 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[5033]: I0319 18:56:33.771135 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:33 crc kubenswrapper[5033]: E0319 18:56:33.771812 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.408235 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.684831 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0" exitCode=0 Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685101 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685144 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685114 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685236 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685305 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685112 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.686440 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.685141 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0"} Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687677 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687675 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687736 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687783 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687807 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.687894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.688302 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.688347 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.688363 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.688894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 
18:56:34.688949 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[5033]: I0319 18:56:34.688971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.428404 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.691471 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3557e7fc840231124d9d0ad59f938cc557d38ec0fe52fc8492edb6a11560f617"} Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.691545 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d912c91e96987e8aaa14cc154d18cc611ac81125fa1f0f808703471be367105"} Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.691563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"82be8f4b8b6cbb8538c1e850b0300b0e876f3268fea10d8fa59bbf7b7ea67790"} Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.691500 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.691654 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.692592 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.692644 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.692658 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.809852 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.810052 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.811270 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.811314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.811328 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[5033]: I0319 18:56:35.818532 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.568908 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7941ed812a7b1b0a9fa3a2e84c9c55982c901493321db52723d9726281a03bb1"} Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b947a1ec11e629f395cec8f721ae101fa8a63878714be2d5b060f93dfa34267"} Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698584 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698627 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698636 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.698770 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.699870 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.699924 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.699942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.699850 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.700006 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.700015 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.700212 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 
18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.700244 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.700263 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.972295 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.974127 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.974183 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.974202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:36 crc kubenswrapper[5033]: I0319 18:56:36.974238 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.120049 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.702313 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.702373 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.703911 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.703967 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.703985 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.703917 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.704043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.704072 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:37 crc kubenswrapper[5033]: I0319 18:56:37.906039 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.361295 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.361558 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.363183 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.363239 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.363257 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.586442 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:38 crc 
kubenswrapper[5033]: I0319 18:56:38.704944 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.705096 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706370 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706403 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706416 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706487 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706563 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:38 crc kubenswrapper[5033]: I0319 18:56:38.706584 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:39 crc kubenswrapper[5033]: I0319 18:56:39.066605 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:39 crc kubenswrapper[5033]: I0319 18:56:39.067040 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:39 crc kubenswrapper[5033]: I0319 18:56:39.068985 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:39 crc kubenswrapper[5033]: I0319 18:56:39.069058 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:39 crc kubenswrapper[5033]: I0319 18:56:39.069078 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:40 crc kubenswrapper[5033]: E0319 18:56:40.670216 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:56:41 crc kubenswrapper[5033]: I0319 18:56:41.362439 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:56:41 crc kubenswrapper[5033]: I0319 18:56:41.362596 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.155052 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.155503 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.157931 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.158018 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.158044 5033 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.671131 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.671284 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.672508 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.672591 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:42 crc kubenswrapper[5033]: I0319 18:56:42.672611 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:44 crc kubenswrapper[5033]: W0319 18:56:44.169847 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.170028 5033 trace.go:236] Trace[530090211]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 18:56:34.167) (total time: 10002ms): Mar 19 18:56:44 crc kubenswrapper[5033]: Trace[530090211]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:56:44.169) Mar 19 18:56:44 crc kubenswrapper[5033]: Trace[530090211]: [10.002013872s] [10.002013872s] END Mar 19 18:56:44 crc kubenswrapper[5033]: E0319 18:56:44.170064 5033 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.543002 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 19 18:56:44 crc kubenswrapper[5033]: W0319 18:56:44.698315 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.698420 5033 trace.go:236] Trace[1476205193]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 18:56:34.696) (total time: 10002ms): Mar 19 18:56:44 crc kubenswrapper[5033]: Trace[1476205193]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:56:44.698) Mar 19 18:56:44 crc kubenswrapper[5033]: Trace[1476205193]: [10.002077174s] [10.002077174s] END Mar 19 18:56:44 crc kubenswrapper[5033]: E0319 18:56:44.698466 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.723811 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.725248 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3a05f948f36f9e83d9469f2b81ae2c94b19a28ca63466f44427eccae0847c75" exitCode=255 Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.725287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b3a05f948f36f9e83d9469f2b81ae2c94b19a28ca63466f44427eccae0847c75"} Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.725411 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.726200 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.726262 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.726282 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:44 crc kubenswrapper[5033]: I0319 18:56:44.727206 5033 scope.go:117] "RemoveContainer" containerID="b3a05f948f36f9e83d9469f2b81ae2c94b19a28ca63466f44427eccae0847c75" Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.095028 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.095359 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.095362 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.096428 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:45 crc kubenswrapper[5033]: W0319 18:56:45.097468 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.097541 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:45 crc kubenswrapper[5033]: W0319 18:56:45.098690 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z Mar 19 18:56:45 crc kubenswrapper[5033]: E0319 18:56:45.098787 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.112746 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.112827 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.117702 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.117835 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.435476 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]log ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]etcd ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 18:56:45 crc 
kubenswrapper[5033]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-apiextensions-informers ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/crd-informer-synced ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-system-namespaces-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 18:56:45 crc kubenswrapper[5033]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 19 18:56:45 crc kubenswrapper[5033]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/bootstrap-controller ok Mar 19 18:56:45 crc 
kubenswrapper[5033]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/start-kube-aggregator-informers ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-registration-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]autoregister-completion ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 18:56:45 crc kubenswrapper[5033]: livez check failed Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.435590 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.545169 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.729519 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.731487 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc"} Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.731701 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.732718 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.732784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:45 crc kubenswrapper[5033]: I0319 18:56:45.732804 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.544389 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:46Z is after 2026-02-23T05:33:13Z Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.736797 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.737529 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.739933 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" exitCode=255 Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.740020 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc"} Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.740124 5033 scope.go:117] "RemoveContainer" containerID="b3a05f948f36f9e83d9469f2b81ae2c94b19a28ca63466f44427eccae0847c75" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.740278 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.741473 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.741524 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.741536 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:46 crc kubenswrapper[5033]: I0319 18:56:46.742390 5033 scope.go:117] "RemoveContainer" containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:56:46 crc kubenswrapper[5033]: E0319 18:56:46.742824 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.120916 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.546099 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:47Z is after 2026-02-23T05:33:13Z Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.745881 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.748852 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.750018 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.750128 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.750221 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:47 crc kubenswrapper[5033]: I0319 18:56:47.751313 5033 scope.go:117] "RemoveContainer" containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:56:47 crc kubenswrapper[5033]: E0319 18:56:47.751732 5033 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:48 crc kubenswrapper[5033]: I0319 18:56:48.549281 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:48Z is after 2026-02-23T05:33:13Z Mar 19 18:56:48 crc kubenswrapper[5033]: W0319 18:56:48.836611 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:48Z is after 2026-02-23T05:33:13Z Mar 19 18:56:48 crc kubenswrapper[5033]: E0319 18:56:48.836714 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:49 crc kubenswrapper[5033]: I0319 18:56:49.547561 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T18:56:49Z is after 2026-02-23T05:33:13Z Mar 19 18:56:50 crc kubenswrapper[5033]: W0319 18:56:50.135773 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z Mar 19 18:56:50 crc kubenswrapper[5033]: E0319 18:56:50.135867 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.434984 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.435224 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.436639 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.436674 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.436682 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.437113 5033 scope.go:117] "RemoveContainer" 
containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:56:50 crc kubenswrapper[5033]: E0319 18:56:50.437271 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.443145 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.545590 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z Mar 19 18:56:50 crc kubenswrapper[5033]: E0319 18:56:50.670893 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.757713 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.758798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.758854 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.758870 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 18:56:50 crc kubenswrapper[5033]: I0319 18:56:50.759629 5033 scope.go:117] "RemoveContainer" containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:56:50 crc kubenswrapper[5033]: E0319 18:56:50.759834 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.362399 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.362519 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.496548 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.498109 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.498171 5033 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.498267 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.498380 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:51 crc kubenswrapper[5033]: E0319 18:56:51.502208 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:51Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:51 crc kubenswrapper[5033]: E0319 18:56:51.502801 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:51Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:51 crc kubenswrapper[5033]: I0319 18:56:51.547311 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:51Z is after 2026-02-23T05:33:13Z Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.195637 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.195938 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.197436 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.197533 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.197552 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.216146 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.546859 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:52Z is after 2026-02-23T05:33:13Z Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.725303 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.725544 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.727156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.727235 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.727261 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.728272 5033 scope.go:117] "RemoveContainer" 
containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:56:52 crc kubenswrapper[5033]: E0319 18:56:52.728650 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.761551 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.762327 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.762365 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:52 crc kubenswrapper[5033]: I0319 18:56:52.762375 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:53 crc kubenswrapper[5033]: I0319 18:56:53.547904 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:53Z is after 2026-02-23T05:33:13Z Mar 19 18:56:53 crc kubenswrapper[5033]: I0319 18:56:53.871325 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:53 crc kubenswrapper[5033]: E0319 18:56:53.877100 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from 
the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:54 crc kubenswrapper[5033]: I0319 18:56:54.547074 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z Mar 19 18:56:55 crc kubenswrapper[5033]: E0319 18:56:55.100087 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:55 crc kubenswrapper[5033]: I0319 18:56:55.548118 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T18:56:55Z is after 2026-02-23T05:33:13Z Mar 19 18:56:56 crc kubenswrapper[5033]: W0319 18:56:56.103277 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z Mar 19 18:56:56 crc kubenswrapper[5033]: E0319 18:56:56.103429 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:56 crc kubenswrapper[5033]: W0319 18:56:56.473799 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z Mar 19 18:56:56 crc kubenswrapper[5033]: E0319 18:56:56.473870 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:56 crc kubenswrapper[5033]: I0319 18:56:56.544706 5033 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z Mar 19 18:56:57 crc kubenswrapper[5033]: W0319 18:56:57.054838 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z Mar 19 18:56:57 crc kubenswrapper[5033]: E0319 18:56:57.054916 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:57 crc kubenswrapper[5033]: I0319 18:56:57.546639 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z Mar 19 18:56:58 crc kubenswrapper[5033]: I0319 18:56:58.503348 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:58 crc kubenswrapper[5033]: I0319 18:56:58.505325 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:58 
crc kubenswrapper[5033]: I0319 18:56:58.505390 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:58 crc kubenswrapper[5033]: I0319 18:56:58.505408 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:58 crc kubenswrapper[5033]: I0319 18:56:58.505506 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:58 crc kubenswrapper[5033]: E0319 18:56:58.509047 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:58 crc kubenswrapper[5033]: E0319 18:56:58.511364 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:58 crc kubenswrapper[5033]: I0319 18:56:58.545513 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:58Z is after 2026-02-23T05:33:13Z Mar 19 18:56:59 crc kubenswrapper[5033]: I0319 18:56:59.547260 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T18:56:59Z is after 2026-02-23T05:33:13Z Mar 19 18:57:00 crc kubenswrapper[5033]: I0319 18:57:00.546493 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:00Z is after 2026-02-23T05:33:13Z Mar 19 18:57:00 crc kubenswrapper[5033]: E0319 18:57:00.671612 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.362020 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.362090 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.362143 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.362275 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.363605 5033 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.363694 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.363714 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.364941 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.365303 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1" gracePeriod=30 Mar 19 18:57:01 crc kubenswrapper[5033]: W0319 18:57:01.539579 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:01Z is after 2026-02-23T05:33:13Z Mar 19 18:57:01 crc kubenswrapper[5033]: E0319 18:57:01.539715 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.547543 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:01Z is after 2026-02-23T05:33:13Z Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.790999 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.791530 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1" exitCode=255 Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.791568 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1"} Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.791594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45d52185f8fc5e49db2530b2365e087ecec35fb5aacf6b470e896fc7b2ba45d2"} Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.791683 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.793003 5033 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.793491 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:01 crc kubenswrapper[5033]: I0319 18:57:01.793506 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:02 crc kubenswrapper[5033]: I0319 18:57:02.544961 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:02Z is after 2026-02-23T05:33:13Z Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.545694 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:03Z is after 2026-02-23T05:33:13Z Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.619980 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.621234 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.621319 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.621336 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:03 crc kubenswrapper[5033]: I0319 18:57:03.622030 5033 scope.go:117] "RemoveContainer" 
containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.546834 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:04Z is after 2026-02-23T05:33:13Z Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.802531 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.803098 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.805780 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" exitCode=255 Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.805849 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3"} Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.805938 5033 scope.go:117] "RemoveContainer" containerID="5d8177b2703e0094c6afc383d3a4df05ca8a2b428627435ba33a0aa93da25fdc" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.806123 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.807540 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.807589 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.807602 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:04 crc kubenswrapper[5033]: I0319 18:57:04.808243 5033 scope.go:117] "RemoveContainer" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:04 crc kubenswrapper[5033]: E0319 18:57:04.808440 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:05 crc kubenswrapper[5033]: E0319 18:57:05.105350 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.511689 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.513637 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.513708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.513729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.513776 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:05 crc kubenswrapper[5033]: E0319 18:57:05.515198 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:57:05 crc kubenswrapper[5033]: E0319 18:57:05.516863 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.545291 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:05Z is after 2026-02-23T05:33:13Z Mar 19 
18:57:05 crc kubenswrapper[5033]: I0319 18:57:05.809853 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:06 crc kubenswrapper[5033]: I0319 18:57:06.545325 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:06Z is after 2026-02-23T05:33:13Z Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.120609 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.120860 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.123438 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.123566 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.123591 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.125050 5033 scope.go:117] "RemoveContainer" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:07 crc kubenswrapper[5033]: E0319 18:57:07.125345 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:07 crc kubenswrapper[5033]: I0319 18:57:07.545997 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:07Z is after 2026-02-23T05:33:13Z Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.362079 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.362275 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.363560 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.363597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.363605 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.544126 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:08Z is after 2026-02-23T05:33:13Z Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.586534 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.820499 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.821730 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.821790 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:08 crc kubenswrapper[5033]: I0319 18:57:08.821848 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:09 crc kubenswrapper[5033]: W0319 18:57:09.179721 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:09Z is after 2026-02-23T05:33:13Z Mar 19 18:57:09 crc kubenswrapper[5033]: E0319 18:57:09.179831 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:57:09 crc kubenswrapper[5033]: I0319 18:57:09.545539 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T18:57:09Z is after 2026-02-23T05:33:13Z Mar 19 18:57:10 crc kubenswrapper[5033]: I0319 18:57:10.546265 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:10Z is after 2026-02-23T05:33:13Z Mar 19 18:57:10 crc kubenswrapper[5033]: I0319 18:57:10.625307 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:57:10 crc kubenswrapper[5033]: E0319 18:57:10.629593 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:57:10 crc kubenswrapper[5033]: E0319 18:57:10.630848 5033 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 18:57:10 crc kubenswrapper[5033]: E0319 18:57:10.672135 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:11 crc kubenswrapper[5033]: I0319 18:57:11.363016 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:57:11 crc kubenswrapper[5033]: I0319 18:57:11.363161 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:57:11 crc kubenswrapper[5033]: I0319 18:57:11.544225 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:11Z is after 2026-02-23T05:33:13Z Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.517751 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.519708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.519776 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.519799 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.519849 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:12 crc kubenswrapper[5033]: E0319 18:57:12.521355 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:12Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:57:12 crc kubenswrapper[5033]: E0319 18:57:12.524775 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.547946 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:12Z is after 2026-02-23T05:33:13Z Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.725651 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.725867 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.727148 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.727192 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.727202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:12 crc kubenswrapper[5033]: I0319 18:57:12.727680 5033 scope.go:117] "RemoveContainer" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:12 crc 
kubenswrapper[5033]: E0319 18:57:12.727836 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:13 crc kubenswrapper[5033]: I0319 18:57:13.547158 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:13Z is after 2026-02-23T05:33:13Z Mar 19 18:57:14 crc kubenswrapper[5033]: I0319 18:57:14.545076 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:14Z is after 2026-02-23T05:33:13Z Mar 19 18:57:14 crc kubenswrapper[5033]: W0319 18:57:14.996641 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:14Z is after 2026-02-23T05:33:13Z Mar 19 18:57:14 crc kubenswrapper[5033]: E0319 18:57:14.996742 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:57:15 crc kubenswrapper[5033]: E0319 18:57:15.112021 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:15 crc kubenswrapper[5033]: W0319 18:57:15.378579 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:15Z is after 2026-02-23T05:33:13Z Mar 19 18:57:15 crc kubenswrapper[5033]: E0319 18:57:15.378698 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:15Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:57:15 crc kubenswrapper[5033]: I0319 18:57:15.545232 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:15Z is after 2026-02-23T05:33:13Z Mar 19 18:57:16 crc kubenswrapper[5033]: I0319 18:57:16.545814 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:16Z is after 2026-02-23T05:33:13Z Mar 19 18:57:17 crc kubenswrapper[5033]: I0319 18:57:17.547542 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:17Z is after 2026-02-23T05:33:13Z Mar 19 18:57:18 crc kubenswrapper[5033]: I0319 18:57:18.547658 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:18Z is after 2026-02-23T05:33:13Z Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.076583 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.076855 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.078901 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.078980 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.079002 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.525780 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.527140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.527179 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.527190 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.527217 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:19 crc kubenswrapper[5033]: E0319 18:57:19.528874 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:19Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:57:19 crc kubenswrapper[5033]: E0319 18:57:19.531075 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T18:57:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:57:19 crc kubenswrapper[5033]: I0319 18:57:19.547590 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:19Z is after 2026-02-23T05:33:13Z Mar 19 18:57:20 crc kubenswrapper[5033]: I0319 18:57:20.545387 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:20Z is after 2026-02-23T05:33:13Z Mar 19 18:57:20 crc kubenswrapper[5033]: E0319 18:57:20.672255 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:21 crc kubenswrapper[5033]: I0319 18:57:21.362502 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:57:21 crc kubenswrapper[5033]: I0319 18:57:21.362617 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:57:21 crc 
kubenswrapper[5033]: I0319 18:57:21.544965 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:21Z is after 2026-02-23T05:33:13Z Mar 19 18:57:22 crc kubenswrapper[5033]: W0319 18:57:22.297761 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 18:57:22 crc kubenswrapper[5033]: E0319 18:57:22.297816 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 18:57:22 crc kubenswrapper[5033]: I0319 18:57:22.551154 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 18:57:23.547104 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 18:57:23.619699 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 18:57:23.620784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 
18:57:23.620821 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 18:57:23.620831 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:23 crc kubenswrapper[5033]: I0319 18:57:23.621263 5033 scope.go:117] "RemoveContainer" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:23 crc kubenswrapper[5033]: E0319 18:57:23.621423 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:24 crc kubenswrapper[5033]: I0319 18:57:24.548681 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.119337 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067b954a6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,LastTimestamp:2026-03-19 18:56:30.538844778 +0000 UTC m=+0.643874627,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.126148 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.133371 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.140548 5033 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.147695 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e530682919dd4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.6560445 +0000 UTC m=+0.761074359,LastTimestamp:2026-03-19 18:56:30.6560445 +0000 UTC m=+0.761074359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.155061 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.722706932 +0000 UTC m=+0.827736791,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.161819 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.722728412 +0000 UTC m=+0.827758271,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.169610 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.722739573 +0000 UTC m=+0.827769432,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.177344 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.723900326 +0000 UTC m=+0.828930195,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.184584 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.723933457 +0000 UTC m=+0.828963316,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.191538 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.723949468 +0000 UTC m=+0.828979337,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.199400 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC 
m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.724134533 +0000 UTC m=+0.829164392,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.207774 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.724153384 +0000 UTC m=+0.829183243,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.215975 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.724167264 +0000 UTC m=+0.829197123,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.223026 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.724964087 +0000 UTC m=+0.829993946,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.228218 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.724977908 +0000 UTC m=+0.830007767,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.233004 5033 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.724992358 +0000 UTC m=+0.830022217,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.239067 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.725960996 +0000 UTC m=+0.830990855,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.244539 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.725979987 +0000 UTC m=+0.831009846,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.248633 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.725990977 +0000 UTC m=+0.831020836,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.252913 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.726127101 +0000 UTC m=+0.831156960,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.260169 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.726141242 +0000 UTC m=+0.831171101,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.265201 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab7f27\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab7f27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590631719 +0000 UTC m=+0.695661628,LastTimestamp:2026-03-19 18:56:30.726151352 +0000 UTC m=+0.831181211,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.272443 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067ea9abea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067ea9abea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590512106 +0000 UTC m=+0.695542015,LastTimestamp:2026-03-19 18:56:30.726627046 +0000 UTC m=+0.831656935,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.277355 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53067eab1176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53067eab1176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.590603638 +0000 UTC 
m=+0.695633547,LastTimestamp:2026-03-19 18:56:30.726663627 +0000 UTC m=+0.831693516,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.286107 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53069d361032 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.103029298 +0000 UTC m=+1.208059157,LastTimestamp:2026-03-19 18:56:31.103029298 +0000 UTC m=+1.208059157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.292961 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53069e5012b7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.121511095 +0000 UTC m=+1.226540984,LastTimestamp:2026-03-19 18:56:31.121511095 +0000 UTC m=+1.226540984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.298478 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53069eb4cfbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.128113087 +0000 UTC m=+1.233142936,LastTimestamp:2026-03-19 18:56:31.128113087 +0000 UTC m=+1.233142936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.302633 5033 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53069f0f4ae1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.134042849 +0000 UTC m=+1.239072698,LastTimestamp:2026-03-19 18:56:31.134042849 +0000 UTC m=+1.239072698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.307159 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e53069f6b1cea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.140060394 +0000 UTC m=+1.245090273,LastTimestamp:2026-03-19 18:56:31.140060394 +0000 UTC m=+1.245090273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.312928 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306beaa1c81 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.664282753 +0000 UTC m=+1.769312622,LastTimestamp:2026-03-19 18:56:31.664282753 +0000 UTC m=+1.769312622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.314933 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5306beae980a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.664576522 +0000 UTC m=+1.769606391,LastTimestamp:2026-03-19 18:56:31.664576522 +0000 UTC m=+1.769606391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.317770 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5306beaf8315 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.664636693 +0000 UTC m=+1.769666532,LastTimestamp:2026-03-19 18:56:31.664636693 +0000 UTC m=+1.769666532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.322152 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e5306beaf8f82 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.664639874 +0000 UTC m=+1.769669723,LastTimestamp:2026-03-19 18:56:31.664639874 +0000 UTC m=+1.769669723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.328277 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5306bf2f5495 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.673013397 +0000 UTC m=+1.778043266,LastTimestamp:2026-03-19 18:56:31.673013397 +0000 UTC m=+1.778043266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.333582 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306bf402605 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.674115589 +0000 UTC m=+1.779145438,LastTimestamp:2026-03-19 18:56:31.674115589 +0000 UTC m=+1.779145438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.338917 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306bf5ec8b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.676123317 +0000 UTC m=+1.781153176,LastTimestamp:2026-03-19 18:56:31.676123317 +0000 UTC m=+1.781153176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.344129 5033 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5306bf7cae9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.678082714 +0000 UTC m=+1.783112573,LastTimestamp:2026-03-19 18:56:31.678082714 +0000 UTC m=+1.783112573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.351791 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5306bf9aade3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.680048611 +0000 UTC m=+1.785078470,LastTimestamp:2026-03-19 18:56:31.680048611 +0000 UTC m=+1.785078470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc 
kubenswrapper[5033]: E0319 18:57:25.357889 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e5306bf9dbfb1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.680249777 +0000 UTC m=+1.785279626,LastTimestamp:2026-03-19 18:56:31.680249777 +0000 UTC m=+1.785279626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.365347 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5306c04acb75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.691590517 +0000 UTC m=+1.796620386,LastTimestamp:2026-03-19 18:56:31.691590517 +0000 UTC m=+1.796620386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc 
kubenswrapper[5033]: E0319 18:57:25.373108 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306d176ff0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.979699981 +0000 UTC m=+2.084729860,LastTimestamp:2026-03-19 18:56:31.979699981 +0000 UTC m=+2.084729860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.378744 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306d21ac27f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.990432383 +0000 UTC m=+2.095462232,LastTimestamp:2026-03-19 18:56:31.990432383 +0000 UTC 
m=+2.095462232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.385383 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306d22b51af openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.991517615 +0000 UTC m=+2.096547484,LastTimestamp:2026-03-19 18:56:31.991517615 +0000 UTC m=+2.096547484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.391497 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306e0361f99 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.227106713 +0000 UTC m=+2.332136562,LastTimestamp:2026-03-19 18:56:32.227106713 +0000 UTC m=+2.332136562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.397761 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306e104529a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.240620186 +0000 UTC m=+2.345650075,LastTimestamp:2026-03-19 18:56:32.240620186 +0000 UTC m=+2.345650075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.403366 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306e1248abc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.242731708 +0000 UTC m=+2.347761557,LastTimestamp:2026-03-19 18:56:32.242731708 +0000 UTC m=+2.347761557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.408592 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306ef0216a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.47535479 +0000 UTC m=+2.580384669,LastTimestamp:2026-03-19 18:56:32.47535479 +0000 UTC 
m=+2.580384669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.414405 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306efe1b4f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.490009846 +0000 UTC m=+2.595039745,LastTimestamp:2026-03-19 18:56:32.490009846 +0000 UTC m=+2.595039745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.420005 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5306f900f109 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.643051785 +0000 UTC m=+2.748081644,LastTimestamp:2026-03-19 18:56:32.643051785 +0000 UTC m=+2.748081644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.431144 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5306f94ef573 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.648164723 +0000 UTC m=+2.753194602,LastTimestamp:2026-03-19 18:56:32.648164723 +0000 UTC m=+2.753194602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.437699 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e5306f965b18b openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.649654667 +0000 UTC m=+2.754684546,LastTimestamp:2026-03-19 18:56:32.649654667 +0000 UTC m=+2.754684546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.444293 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5306f9b619d0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.65492424 +0000 UTC m=+2.759954099,LastTimestamp:2026-03-19 18:56:32.65492424 +0000 UTC m=+2.759954099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 
18:57:25.449674 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530706b99a7a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.873257594 +0000 UTC m=+2.978287443,LastTimestamp:2026-03-19 18:56:32.873257594 +0000 UTC m=+2.978287443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.456080 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530706c56e21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.874032673 +0000 UTC m=+2.979062522,LastTimestamp:2026-03-19 18:56:32.874032673 +0000 UTC m=+2.979062522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 
18:57:25.462260 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5307079ca32c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.888136492 +0000 UTC m=+2.993166341,LastTimestamp:2026-03-19 18:56:32.888136492 +0000 UTC m=+2.993166341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.467773 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530707a5db27 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.888740647 +0000 UTC m=+2.993770496,LastTimestamp:2026-03-19 18:56:32.888740647 +0000 UTC m=+2.993770496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.472337 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530707b43850 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.889682 +0000 UTC m=+2.994711849,LastTimestamp:2026-03-19 18:56:32.889682 +0000 UTC m=+2.994711849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.476636 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530707ebdcce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.89332859 
+0000 UTC m=+2.998358439,LastTimestamp:2026-03-19 18:56:32.89332859 +0000 UTC m=+2.998358439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.482582 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e530707f140b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.893681849 +0000 UTC m=+2.998711698,LastTimestamp:2026-03-19 18:56:32.893681849 +0000 UTC m=+2.998711698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.488730 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53070812c409 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.895878153 +0000 UTC m=+3.000908002,LastTimestamp:2026-03-19 18:56:32.895878153 +0000 UTC m=+3.000908002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.494903 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5307087d8cba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.902876346 +0000 UTC m=+3.007906195,LastTimestamp:2026-03-19 18:56:32.902876346 +0000 UTC m=+3.007906195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.499579 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e530709d0aec5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:32.925101765 +0000 UTC m=+3.030131614,LastTimestamp:2026-03-19 18:56:32.925101765 +0000 UTC m=+3.030131614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.505656 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530712562667 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.068066407 +0000 UTC m=+3.173096276,LastTimestamp:2026-03-19 18:56:33.068066407 +0000 UTC m=+3.173096276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.511831 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e530712a06b0f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.072933647 +0000 UTC m=+3.177963496,LastTimestamp:2026-03-19 18:56:33.072933647 +0000 UTC m=+3.177963496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.519311 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530713854ca8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.087933608 +0000 UTC m=+3.192963467,LastTimestamp:2026-03-19 18:56:33.087933608 +0000 UTC m=+3.192963467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.522959 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530713956148 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.088987464 +0000 UTC m=+3.194017313,LastTimestamp:2026-03-19 18:56:33.088987464 +0000 UTC m=+3.194017313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.526698 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e530713b03c19 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.090747417 +0000 UTC m=+3.195777266,LastTimestamp:2026-03-19 18:56:33.090747417 +0000 UTC m=+3.195777266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.530121 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e530713c5a34c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.092150092 +0000 UTC m=+3.197179961,LastTimestamp:2026-03-19 18:56:33.092150092 +0000 UTC m=+3.197179961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.534171 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53071ddb0764 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.261324132 +0000 UTC m=+3.366353981,LastTimestamp:2026-03-19 18:56:33.261324132 +0000 UTC m=+3.366353981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.537545 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53071df37c53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.262926931 +0000 UTC m=+3.367956780,LastTimestamp:2026-03-19 18:56:33.262926931 +0000 UTC m=+3.367956780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.540222 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53071f0226a1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.280665249 +0000 UTC m=+3.385695098,LastTimestamp:2026-03-19 18:56:33.280665249 +0000 UTC m=+3.385695098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: I0319 18:57:25.543839 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.544223 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53071f20cae3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.282673379 +0000 UTC m=+3.387703228,LastTimestamp:2026-03-19 18:56:33.282673379 +0000 UTC m=+3.387703228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.548347 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53071f41d10b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.284837643 +0000 UTC m=+3.389867492,LastTimestamp:2026-03-19 18:56:33.284837643 +0000 UTC m=+3.389867492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.550489 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5307296b9c27 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.455348775 +0000 UTC m=+3.560378624,LastTimestamp:2026-03-19 18:56:33.455348775 +0000 UTC m=+3.560378624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.553403 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53072a0ad65f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.465783903 +0000 UTC m=+3.570813752,LastTimestamp:2026-03-19 18:56:33.465783903 +0000 UTC m=+3.570813752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.557076 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53072a191ed4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.466719956 +0000 UTC m=+3.571749805,LastTimestamp:2026-03-19 18:56:33.466719956 +0000 UTC m=+3.571749805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.559574 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5307330cb14a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.616900426 +0000 UTC m=+3.721930275,LastTimestamp:2026-03-19 18:56:33.616900426 +0000 UTC m=+3.721930275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.563604 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530733d6576d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.630115693 +0000 UTC m=+3.735145542,LastTimestamp:2026-03-19 18:56:33.630115693 +0000 UTC m=+3.735145542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.569270 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530736d4f90c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.680357644 +0000 UTC m=+3.785387493,LastTimestamp:2026-03-19 18:56:33.680357644 +0000 UTC m=+3.785387493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc 
kubenswrapper[5033]: E0319 18:57:25.573789 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530742a1cadb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.878330075 +0000 UTC m=+3.983359954,LastTimestamp:2026-03-19 18:56:33.878330075 +0000 UTC m=+3.983359954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.579061 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530743c7c105 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.897595141 +0000 UTC m=+4.002624990,LastTimestamp:2026-03-19 18:56:33.897595141 +0000 UTC m=+4.002624990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.586142 
5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307731fcbb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:34.691894195 +0000 UTC m=+4.796924074,LastTimestamp:2026-03-19 18:56:34.691894195 +0000 UTC m=+4.796924074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.590777 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530783c09ac1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:34.970868417 +0000 UTC m=+5.075898306,LastTimestamp:2026-03-19 18:56:34.970868417 +0000 UTC m=+5.075898306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc 
kubenswrapper[5033]: E0319 18:57:25.595212 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530784943320 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:34.98473552 +0000 UTC m=+5.089765399,LastTimestamp:2026-03-19 18:56:34.98473552 +0000 UTC m=+5.089765399,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.599948 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530784ad07ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:34.98636286 +0000 UTC m=+5.091392739,LastTimestamp:2026-03-19 18:56:34.98636286 +0000 UTC m=+5.091392739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.604329 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530793e2219b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.241501083 +0000 UTC m=+5.346530972,LastTimestamp:2026-03-19 18:56:35.241501083 +0000 UTC m=+5.346530972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.608504 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307952fd8fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.263371514 +0000 UTC m=+5.368401403,LastTimestamp:2026-03-19 18:56:35.263371514 +0000 UTC m=+5.368401403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc 
kubenswrapper[5033]: E0319 18:57:25.613188 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530795485681 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.264976513 +0000 UTC m=+5.370006392,LastTimestamp:2026-03-19 18:56:35.264976513 +0000 UTC m=+5.370006392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.619268 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307a589ec00 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.53771008 +0000 UTC m=+5.642739969,LastTimestamp:2026-03-19 18:56:35.53771008 +0000 UTC m=+5.642739969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.623627 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307a6845f24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.554123556 +0000 UTC m=+5.659153415,LastTimestamp:2026-03-19 18:56:35.554123556 +0000 UTC m=+5.659153415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.629584 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307a6968e23 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.555315235 +0000 UTC m=+5.660345084,LastTimestamp:2026-03-19 18:56:35.555315235 +0000 UTC 
m=+5.660345084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.635927 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307b559fcab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.803004075 +0000 UTC m=+5.908033934,LastTimestamp:2026-03-19 18:56:35.803004075 +0000 UTC m=+5.908033934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.643071 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307b6490726 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.818669862 +0000 UTC m=+5.923699721,LastTimestamp:2026-03-19 18:56:35.818669862 +0000 UTC m=+5.923699721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.647938 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307b65f2c4f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:35.820121167 +0000 UTC m=+5.925151026,LastTimestamp:2026-03-19 18:56:35.820121167 +0000 UTC m=+5.925151026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.655029 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307c3bd9419 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:36.044411929 +0000 UTC m=+6.149441768,LastTimestamp:2026-03-19 18:56:36.044411929 +0000 UTC 
m=+6.149441768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.662619 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5307c4599753 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:36.054636371 +0000 UTC m=+6.159666250,LastTimestamp:2026-03-19 18:56:36.054636371 +0000 UTC m=+6.159666250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.672385 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530900ba19ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers) Mar 19 18:57:25 crc kubenswrapper[5033]: body: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:41.362561451 +0000 UTC m=+11.467591340,LastTimestamp:2026-03-19 18:56:41.362561451 +0000 UTC m=+11.467591340,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.677497 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530900bb8e2d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:41.362656813 +0000 UTC m=+11.467686712,LastTimestamp:2026-03-19 18:56:41.362656813 +0000 UTC m=+11.467686712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.683752 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e53072a191ed4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e53072a191ed4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.466719956 +0000 UTC m=+3.571749805,LastTimestamp:2026-03-19 18:56:44.728385553 +0000 UTC m=+14.833415412,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.688481 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e5307330cb14a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5307330cb14a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.616900426 +0000 UTC m=+3.721930275,LastTimestamp:2026-03-19 18:56:44.91314857 +0000 UTC m=+15.018178419,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 
18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.693210 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e530733d6576d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530733d6576d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:33.630115693 +0000 UTC m=+3.735145542,LastTimestamp:2026-03-19 18:56:44.934096964 +0000 UTC m=+15.039126833,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.698197 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-apiserver-crc.189e5309e04241f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 18:57:25 crc kubenswrapper[5033]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" 
cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 18:57:25 crc kubenswrapper[5033]: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:45.112803826 +0000 UTC m=+15.217833715,LastTimestamp:2026-03-19 18:56:45.112803826 +0000 UTC m=+15.217833715,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.702661 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5309e0432088 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:45.112860808 +0000 UTC m=+15.217890697,LastTimestamp:2026-03-19 18:56:45.112860808 +0000 UTC m=+15.217890697,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.708921 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e5309e04241f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-apiserver-crc.189e5309e04241f2 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 18:57:25 crc kubenswrapper[5033]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 18:57:25 crc kubenswrapper[5033]: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:45.112803826 +0000 UTC m=+15.217833715,LastTimestamp:2026-03-19 18:56:45.11778797 +0000 UTC m=+15.222817859,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.714421 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e5309e0432088\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5309e0432088 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:45.112860808 +0000 UTC m=+15.217890697,LastTimestamp:2026-03-19 18:56:45.117925784 +0000 UTC 
m=+15.222955673,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.720985 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c4d383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:25 crc kubenswrapper[5033]: body: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362485123 +0000 UTC m=+21.467514982,LastTimestamp:2026-03-19 18:56:51.362485123 +0000 UTC m=+21.467514982,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.728273 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c5c44d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362546765 +0000 UTC m=+21.467576624,LastTimestamp:2026-03-19 18:56:51.362546765 +0000 UTC m=+21.467576624,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.734599 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e530b54c4d383\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c4d383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:25 crc kubenswrapper[5033]: body: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362485123 +0000 UTC m=+21.467514982,LastTimestamp:2026-03-19 18:57:01.362072695 
+0000 UTC m=+31.467102544,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.740293 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e530b54c5c44d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c5c44d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362546765 +0000 UTC m=+21.467576624,LastTimestamp:2026-03-19 18:57:01.362116417 +0000 UTC m=+31.467146266,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.746595 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530da8fb419c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:57:01.365272988 +0000 UTC m=+31.470302867,LastTimestamp:2026-03-19 18:57:01.365272988 +0000 UTC m=+31.470302867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.752828 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e5306bf5ec8b5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306bf5ec8b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.676123317 +0000 UTC m=+1.781153176,LastTimestamp:2026-03-19 18:57:01.482400065 +0000 UTC m=+31.587429944,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.757588 5033 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e5306d176ff0d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306d176ff0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.979699981 +0000 UTC m=+2.084729860,LastTimestamp:2026-03-19 18:57:01.697682622 +0000 UTC m=+31.802712471,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.762328 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e5306d21ac27f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5306d21ac27f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:31.990432383 +0000 UTC 
m=+2.095462232,LastTimestamp:2026-03-19 18:57:01.707398742 +0000 UTC m=+31.812428601,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.770357 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e530b54c4d383\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c4d383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:25 crc kubenswrapper[5033]: body: Mar 19 18:57:25 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362485123 +0000 UTC m=+21.467514982,LastTimestamp:2026-03-19 18:57:11.363135871 +0000 UTC m=+41.468165810,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.774756 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e530b54c5c44d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c5c44d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362546765 +0000 UTC m=+21.467576624,LastTimestamp:2026-03-19 18:57:11.363231104 +0000 UTC m=+41.468260983,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:25 crc kubenswrapper[5033]: E0319 18:57:25.781984 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e530b54c4d383\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:25 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530b54c4d383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:25 crc kubenswrapper[5033]: body: Mar 19 18:57:25 crc kubenswrapper[5033]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:51.362485123 +0000 UTC m=+21.467514982,LastTimestamp:2026-03-19 18:57:21.362562577 +0000 UTC m=+51.467592466,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:25 crc kubenswrapper[5033]: > Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.531721 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.533359 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.533438 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.533490 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.533526 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:26 crc kubenswrapper[5033]: E0319 18:57:26.534953 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:26 crc kubenswrapper[5033]: E0319 18:57:26.535144 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:26 crc kubenswrapper[5033]: I0319 18:57:26.543386 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:27 crc kubenswrapper[5033]: I0319 18:57:27.547082 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:28 crc kubenswrapper[5033]: I0319 18:57:28.548658 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:29 crc kubenswrapper[5033]: I0319 18:57:29.546073 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:30 crc kubenswrapper[5033]: I0319 18:57:30.548538 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:30 crc kubenswrapper[5033]: E0319 18:57:30.672402 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.361866 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.361986 5033 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.362080 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.362315 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.365442 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.365570 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.365597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.367003 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"45d52185f8fc5e49db2530b2365e087ecec35fb5aacf6b470e896fc7b2ba45d2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.367286 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://45d52185f8fc5e49db2530b2365e087ecec35fb5aacf6b470e896fc7b2ba45d2" gracePeriod=30 Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.547491 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.884955 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.885260 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="45d52185f8fc5e49db2530b2365e087ecec35fb5aacf6b470e896fc7b2ba45d2" exitCode=0 Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.885288 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"45d52185f8fc5e49db2530b2365e087ecec35fb5aacf6b470e896fc7b2ba45d2"} Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.885311 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b69e7842478b1409113614b6b0539bf9deb00062fa7061214122537f45a04d1"} Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.885327 5033 scope.go:117] "RemoveContainer" containerID="caeef98686ed0a2c710fad3aaf09ff144f0c18d3b8a4dc6538a07417980381f1" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.885440 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:31 crc 
kubenswrapper[5033]: I0319 18:57:31.886400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.886425 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:31 crc kubenswrapper[5033]: I0319 18:57:31.886433 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:32 crc kubenswrapper[5033]: I0319 18:57:32.546866 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.537386 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.539815 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.539877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.539897 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.539953 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:33 crc kubenswrapper[5033]: I0319 18:57:33.547438 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:33 crc kubenswrapper[5033]: E0319 18:57:33.547567 5033 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:33 crc kubenswrapper[5033]: E0319 18:57:33.551791 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:34 crc kubenswrapper[5033]: I0319 18:57:34.548822 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.547583 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.619746 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.620956 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.620997 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.621006 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.621496 5033 scope.go:117] "RemoveContainer" 
containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.900899 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.902687 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e"} Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.902859 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.903693 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.903722 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:35 crc kubenswrapper[5033]: I0319 18:57:35.903731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.545988 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.907015 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.907504 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.909389 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" exitCode=255 Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.909434 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e"} Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.909490 5033 scope.go:117] "RemoveContainer" containerID="a5e486ed27e9039af6340dcc613105417de3dffeb325e74de6323dc416f7a6f3" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.909825 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.915306 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.915373 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.915401 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:36 crc kubenswrapper[5033]: I0319 18:57:36.916498 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:57:36 crc kubenswrapper[5033]: E0319 18:57:36.917591 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.120996 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.545199 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.915233 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.919197 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.920144 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.920217 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.920236 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:37 crc kubenswrapper[5033]: I0319 18:57:37.921034 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:57:37 crc kubenswrapper[5033]: E0319 18:57:37.921364 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:38 crc kubenswrapper[5033]: W0319 18:57:38.254040 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:38 crc kubenswrapper[5033]: E0319 18:57:38.254106 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.361517 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.361692 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.362842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.362883 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.362892 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.366819 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.545031 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.586631 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.923787 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.924545 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.924641 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:38 crc kubenswrapper[5033]: I0319 18:57:38.924705 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[5033]: I0319 18:57:39.545780 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:39 crc kubenswrapper[5033]: I0319 18:57:39.925941 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:39 crc kubenswrapper[5033]: I0319 18:57:39.926896 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[5033]: I0319 18:57:39.926941 5033 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[5033]: I0319 18:57:39.926955 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.547676 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.548537 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.549393 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.549427 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.549436 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:40 crc kubenswrapper[5033]: I0319 18:57:40.549477 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:40 crc kubenswrapper[5033]: E0319 18:57:40.552891 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:40 crc kubenswrapper[5033]: E0319 18:57:40.553495 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:40 crc kubenswrapper[5033]: 
E0319 18:57:40.672528 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[5033]: I0319 18:57:41.548571 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.545407 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.632594 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.650618 5033 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.725569 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.725735 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.726711 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.726740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.726749 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 
18:57:42 crc kubenswrapper[5033]: I0319 18:57:42.727210 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:57:42 crc kubenswrapper[5033]: E0319 18:57:42.727356 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:43 crc kubenswrapper[5033]: I0319 18:57:43.546400 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:44 crc kubenswrapper[5033]: I0319 18:57:44.545292 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:45 crc kubenswrapper[5033]: I0319 18:57:45.546893 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:45 crc kubenswrapper[5033]: I0319 18:57:45.623397 5033 csr.go:261] certificate signing request csr-8mcx4 is approved, waiting to be issued Mar 19 18:57:45 crc kubenswrapper[5033]: I0319 18:57:45.631585 5033 csr.go:257] certificate signing request csr-8mcx4 is issued Mar 19 18:57:45 crc kubenswrapper[5033]: I0319 18:57:45.736021 5033 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 18:57:46 crc 
kubenswrapper[5033]: I0319 18:57:46.385052 5033 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 18:57:46 crc kubenswrapper[5033]: I0319 18:57:46.633114 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 11:33:28.910041479 +0000 UTC Mar 19 18:57:46 crc kubenswrapper[5033]: I0319 18:57:46.633182 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6016h35m42.276864625s for next certificate rotation Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.553834 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.555739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.555850 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.555878 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.556088 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.566835 5033 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.567180 5033 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.567213 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.572599 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.572682 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.572701 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.572720 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.572735 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.593666 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.604437 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.604546 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.604571 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.604597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.604615 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.621410 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.633119 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.633175 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.633194 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.633220 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.633242 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.650058 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.663135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.663186 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.663201 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.663225 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[5033]: I0319 18:57:47.663239 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.675319 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.675597 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.675633 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.776163 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.876873 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:47 crc kubenswrapper[5033]: E0319 18:57:47.977075 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.077834 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.178677 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.279078 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.379916 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.464296 5033 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.480500 
5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.580926 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.590732 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.590850 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.591740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.591788 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[5033]: I0319 18:57:48.591805 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.681861 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.782852 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.883688 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:48 crc kubenswrapper[5033]: E0319 18:57:48.984430 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.084675 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.185574 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.285917 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.386096 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.486594 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.587432 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.687698 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.788520 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.889382 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:49 crc kubenswrapper[5033]: E0319 18:57:49.989985 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.090770 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.191481 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.292255 5033 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.393438 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.493991 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.594153 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.672996 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.694534 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.795025 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.895985 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:50 crc kubenswrapper[5033]: E0319 18:57:50.996835 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.097035 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.198092 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.299257 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: 
E0319 18:57:51.399835 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.500254 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.601326 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: I0319 18:57:51.619909 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:51 crc kubenswrapper[5033]: I0319 18:57:51.622126 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:51 crc kubenswrapper[5033]: I0319 18:57:51.622195 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:51 crc kubenswrapper[5033]: I0319 18:57:51.622223 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.702420 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.802880 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:51 crc kubenswrapper[5033]: E0319 18:57:51.903825 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.004265 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.105062 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.205899 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.306975 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.407594 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.445880 5033 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.510500 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.510588 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.510623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.510660 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.510689 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:52Z","lastTransitionTime":"2026-03-19T18:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.573118 5033 apiserver.go:52] "Watching apiserver" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.580332 5033 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.580830 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.581499 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.581547 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.581765 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.581929 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.582114 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.582168 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.582414 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.582728 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.582830 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.585758 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.585944 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.587687 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.587782 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.587936 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.587975 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.588119 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.589182 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.589487 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.614022 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:52 crc 
kubenswrapper[5033]: I0319 18:57:52.614106 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.614132 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.614169 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.614203 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:52Z","lastTransitionTime":"2026-03-19T18:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.635284 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.645298 5033 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.657786 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.676860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.676984 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677029 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677068 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677103 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677142 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677180 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677228 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677267 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677312 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677352 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677390 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677510 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677558 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677595 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677632 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677672 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677725 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677779 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677821 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677893 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:52 crc 
kubenswrapper[5033]: I0319 18:57:52.677928 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.677966 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678057 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678090 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678120 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678144 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678172 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678200 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678178 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678227 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678383 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678575 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678645 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678405 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678702 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678755 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678815 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678873 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678941 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678995 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679046 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679145 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679199 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679251 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679309 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679370 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679427 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679673 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679729 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679789 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679844 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679977 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680020 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680061 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680098 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680135 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680170 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680204 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680279 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680313 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680352 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680436 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680530 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680569 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 
18:57:52.680604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680805 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680867 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681043 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681080 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681117 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681165 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681218 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681270 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:52 
crc kubenswrapper[5033]: I0319 18:57:52.681319 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681378 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681435 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681525 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681584 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681692 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681744 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681799 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681878 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681928 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 
18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681979 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682031 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682083 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682148 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682197 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682297 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682398 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682490 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682542 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 
18:57:52.682589 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682690 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682806 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682859 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682910 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682959 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683017 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683065 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683118 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:52 crc 
kubenswrapper[5033]: I0319 18:57:52.683169 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683226 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683286 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683340 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683379 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683417 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683546 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683600 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683604 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683650 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684007 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684061 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684148 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684189 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684280 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684321 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684488 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678872 5033 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.678919 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684561 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.679984 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680288 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680505 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.680884 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681263 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681303 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681332 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681573 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681799 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.681799 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682038 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682306 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684756 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682379 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682840 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.682921 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683036 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683190 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.683664 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684760 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684808 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684969 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685091 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.684545 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685146 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685206 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:52 crc 
kubenswrapper[5033]: I0319 18:57:52.685226 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685246 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685264 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685284 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685303 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685320 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685343 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685383 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685400 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685420 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 
18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685484 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685520 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685539 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685557 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685592 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685613 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685632 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685652 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 
18:57:52.685669 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685685 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685707 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685726 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685744 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685770 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685790 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685808 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685826 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685876 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685895 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685917 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685934 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685954 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686227 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686251 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686271 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686310 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 
18:57:52.686380 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686401 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686419 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686439 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686496 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686534 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686578 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686601 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686664 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 
18:57:52.686683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686704 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686807 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686830 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685103 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685294 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685436 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685509 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685743 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685887 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686982 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.686560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685920 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.685925 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.687190 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.687297 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.687563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.687750 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.688100 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.688151 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.688222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.688355 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.688570 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.689634 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.689760 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.689831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.689959 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.690090 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.690178 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.690367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.690800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.690951 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.691566 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692236 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692308 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692356 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692356 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692395 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.692663 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.693206 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.693120 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.693365 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.694027 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.694214 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.694338 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.694912 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695043 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695243 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695342 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695843 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.695808 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.696728 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.696941 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.696978 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.697489 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.697648 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.697716 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.698397 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.699233 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.699811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.699828 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700028 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700389 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700575 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700840 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700885 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700974 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.700992 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.701595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.701711 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.701775 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.701831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.701913 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.702147 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.702237 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.702563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.702645 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.702660 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703177 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703278 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703677 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703872 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703862 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.703979 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.704005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.704317 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.704685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.704809 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705022 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705168 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705231 5033 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705314 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705388 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.705702 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.706080 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.706446 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.706692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.706875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.706884 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707089 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707585 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707812 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707864 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.707941 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.708204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.708263 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.708400 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.708771 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.708792 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.708940 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:53.208892333 +0000 UTC m=+83.313922222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.709307 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.709601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.710187 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.710709 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.711078 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712283 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712435 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712436 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.712638 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712652 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 
18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.712806 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:53.212762485 +0000 UTC m=+83.317792544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712876 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.712958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.713364 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.713414 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.713604 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:53.213538401 +0000 UTC m=+83.318568460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.713815 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.714103 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.714921 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.715331 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.715843 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.716854 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717392 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717495 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717532 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717563 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717593 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717626 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717657 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717687 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717720 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717750 5033 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717778 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717807 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717837 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717863 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717892 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717925 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717955 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.717984 5033 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718013 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718041 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718070 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718101 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718127 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718154 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718181 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718207 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718235 5033 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718263 5033 reconciler_common.go:293] 
"Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718293 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718322 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718352 5033 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718380 5033 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718380 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718409 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718409 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.718560 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719033 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719146 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719180 5033 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719209 5033 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719241 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719263 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719286 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719315 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719337 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719360 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719383 5033 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node 
\"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719404 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719425 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719445 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719534 5033 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.719556 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.721926 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.722235 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.722793 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723063 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723114 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723347 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723652 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723682 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723837 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.723982 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.724268 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725344 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725388 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725407 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725441 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725489 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:52Z","lastTransitionTime":"2026-03-19T18:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.725804 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.727533 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.727577 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.732377 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.732502 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.732593 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.732722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.733015 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.733038 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.733055 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.733664 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:53.233643714 +0000 UTC m=+83.338673573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.734345 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.735786 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.736038 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.736054 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.735820 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.736365 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.737200 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:53.237146443 +0000 UTC m=+83.342176342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.737274 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.737305 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.737789 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.738139 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739001 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739031 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739275 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739415 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739419 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.739977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.740035 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.740204 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.740345 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.740417 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.740385 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.741350 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.742297 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.742440 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.743006 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.743927 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.744241 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.744912 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.744918 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.744951 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.745711 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.748671 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.750839 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.765971 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.766961 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.769224 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.778318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.782360 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.790602 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.798600 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.811936 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.820382 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.820577 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.820755 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.820968 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821174 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821314 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821437 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821647 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821776 5033 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath 
\"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.821908 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822032 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822158 5033 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822287 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822444 5033 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822612 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822744 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822862 5033 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.822978 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823094 5033 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823210 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823336 5033 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823491 5033 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823626 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823747 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823867 5033 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.823984 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824105 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824221 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824338 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824485 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824621 5033 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 
19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824748 5033 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824874 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.824995 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825113 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825232 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825349 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825518 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825670 5033 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825793 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.825918 5033 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826038 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826161 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826286 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826413 5033 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826590 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826752 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.826914 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827036 5033 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827202 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827337 5033 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827490 5033 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827624 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827749 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827875 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.827987 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828110 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828233 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828345 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828497 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath 
\"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828635 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828724 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828745 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828772 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828789 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:52Z","lastTransitionTime":"2026-03-19T18:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828642 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828934 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828956 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828978 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.828998 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829015 5033 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829033 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829051 5033 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829070 5033 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829088 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829105 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829125 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829142 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829159 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829176 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829193 5033 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829210 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829227 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829245 5033 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829262 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829278 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" 
DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829295 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829312 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829328 5033 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829346 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829364 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829381 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829397 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829415 5033 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829432 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829485 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829505 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829525 5033 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829542 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829613 5033 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829631 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829648 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829665 5033 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829682 5033 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829700 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829717 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829735 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829754 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829771 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829789 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829806 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829824 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829842 5033 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829860 5033 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829878 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") 
on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829894 5033 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829911 5033 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829928 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829946 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829964 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829981 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.829997 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830017 5033 reconciler_common.go:293] 
"Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830035 5033 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830052 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830070 5033 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830087 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830103 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830122 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830140 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830158 5033 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830176 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830193 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830210 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830226 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830244 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830261 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath 
\"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830278 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830297 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830315 5033 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830331 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830349 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830367 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830384 5033 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830401 5033 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830419 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830439 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830485 5033 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830503 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830520 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830537 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830556 5033 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.830577 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.899304 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.904781 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.912570 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.927070 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: source /etc/kubernetes/apiserver-url.env Mar 19 18:57:52 crc kubenswrapper[5033]: else Mar 19 18:57:52 crc kubenswrapper[5033]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 18:57:52 crc kubenswrapper[5033]: exit 1 Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 18:57:52 crc 
kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Valu
e:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metad
ata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.928778 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.932021 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:52 crc 
kubenswrapper[5033]: I0319 18:57:52.932074 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.932099 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.932127 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.932148 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:52Z","lastTransitionTime":"2026-03-19T18:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:52 crc kubenswrapper[5033]: W0319 18:57:52.945102 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-62c77604891c0692921895adce4a4af0e5483cd3a7975a60ec9b8c2fef0f654b WatchSource:0}: Error finding container 62c77604891c0692921895adce4a4af0e5483cd3a7975a60ec9b8c2fef0f654b: Status 404 returned error can't find the container with id 62c77604891c0692921895adce4a4af0e5483cd3a7975a60ec9b8c2fef0f654b Mar 19 18:57:52 crc kubenswrapper[5033]: W0319 18:57:52.946639 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-571c58b8a7ace7b37ea8d8105534176ffd2d41254650ce258a5f02de45ca93c2 WatchSource:0}: Error finding container 571c58b8a7ace7b37ea8d8105534176ffd2d41254650ce258a5f02de45ca93c2: Status 404 returned error can't find the container with 
id 571c58b8a7ace7b37ea8d8105534176ffd2d41254650ce258a5f02de45ca93c2 Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.950056 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: source "/env/_master" Mar 19 18:57:52 crc kubenswrapper[5033]: set +o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 19 18:57:52 crc kubenswrapper[5033]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 18:57:52 crc kubenswrapper[5033]: ho_enable="--enable-hybrid-overlay" Mar 19 18:57:52 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 18:57:52 crc kubenswrapper[5033]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 18:57:52 crc kubenswrapper[5033]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-host=127.0.0.1 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-port=9743 \ Mar 19 18:57:52 crc kubenswrapper[5033]: ${ho_enable} \ Mar 19 18:57:52 crc kubenswrapper[5033]: --enable-interconnect \ Mar 19 18:57:52 crc kubenswrapper[5033]: --disable-approver \ Mar 19 18:57:52 crc 
kubenswrapper[5033]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --wait-for-kubernetes-api=200s \ Mar 19 18:57:52 crc kubenswrapper[5033]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Mar 19 18:57:52 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdi
nOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.951218 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.952399 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.952634 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: source "/env/_master" Mar 19 18:57:52 crc kubenswrapper[5033]: set +o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: Mar 19 18:57:52 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --disable-webhook \ Mar 19 18:57:52 crc kubenswrapper[5033]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Mar 19 18:57:52 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.953976 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.962500 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"62c77604891c0692921895adce4a4af0e5483cd3a7975a60ec9b8c2fef0f654b"} Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.963963 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e1a4c3091b1b115553bf09616762372916060347504dbcc16f74490a6c7196ed"} Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.965537 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: source "/env/_master" Mar 19 18:57:52 crc kubenswrapper[5033]: set +o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 18:57:52 crc kubenswrapper[5033]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 18:57:52 crc kubenswrapper[5033]: ho_enable="--enable-hybrid-overlay" Mar 19 18:57:52 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 18:57:52 crc kubenswrapper[5033]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 18:57:52 crc kubenswrapper[5033]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-host=127.0.0.1 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --webhook-port=9743 \ Mar 19 18:57:52 crc kubenswrapper[5033]: ${ho_enable} \ Mar 19 18:57:52 crc kubenswrapper[5033]: --enable-interconnect \ Mar 19 18:57:52 crc kubenswrapper[5033]: --disable-approver \ Mar 19 18:57:52 crc kubenswrapper[5033]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --wait-for-kubernetes-api=200s \ Mar 19 18:57:52 crc kubenswrapper[5033]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Mar 19 18:57:52 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.965716 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 
18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: source /etc/kubernetes/apiserver-url.env Mar 19 18:57:52 crc kubenswrapper[5033]: else Mar 19 18:57:52 crc kubenswrapper[5033]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 18:57:52 crc kubenswrapper[5033]: exit 1 Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 18:57:52 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 
18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.966698 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"571c58b8a7ace7b37ea8d8105534176ffd2d41254650ce258a5f02de45ca93c2"} Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.966821 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.967331 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:57:52 crc kubenswrapper[5033]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 18:57:52 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Mar 19 18:57:52 crc kubenswrapper[5033]: set -o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: source "/env/_master" Mar 19 18:57:52 crc kubenswrapper[5033]: set +o allexport Mar 19 18:57:52 crc kubenswrapper[5033]: fi Mar 19 18:57:52 crc kubenswrapper[5033]: Mar 19 18:57:52 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 18:57:52 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 18:57:52 crc kubenswrapper[5033]: --disable-webhook \ Mar 19 18:57:52 crc kubenswrapper[5033]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 18:57:52 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Mar 19 18:57:52 crc kubenswrapper[5033]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 18:57:52 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.968559 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.969058 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 18:57:52 crc kubenswrapper[5033]: E0319 18:57:52.970296 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.982881 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:52 crc kubenswrapper[5033]: I0319 18:57:52.998754 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.012871 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.028414 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.035113 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.035155 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.035165 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 
18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.035187 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.035203 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.041906 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.057534 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.072521 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.087734 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.099529 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.115500 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.131301 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.138759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.138808 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.138825 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.138848 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.138908 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.152369 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.234204 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.234296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.234341 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.234397 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234425 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:54.234387439 +0000 UTC m=+84.339417328 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234567 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234598 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234622 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234634 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:54.234616317 +0000 UTC m=+84.339646206 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234641 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234699 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:54.234687259 +0000 UTC m=+84.339717148 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234710 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.234776 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:54.234755912 +0000 UTC m=+84.339785791 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.241498 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.241548 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.241567 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.241591 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.241608 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.335189 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.335373 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.335398 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.335419 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:53 crc kubenswrapper[5033]: E0319 18:57:53.335543 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:54.335520094 +0000 UTC m=+84.440549983 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.344874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.345137 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.345163 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.345195 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.345221 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.447700 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.447774 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.447792 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.447809 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.447822 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.550775 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.550827 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.550840 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.550859 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.550871 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.653429 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.653502 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.653513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.653530 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.653542 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.755993 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.756036 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.756047 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.756063 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.756077 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.859137 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.859200 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.859218 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.859241 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.859257 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.962159 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.962240 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.962263 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.962293 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:53 crc kubenswrapper[5033]: I0319 18:57:53.962317 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:53Z","lastTransitionTime":"2026-03-19T18:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.066250 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.066321 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.066338 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.066363 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.066380 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.169711 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.169793 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.169812 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.169837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.169854 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.244610 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.244735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.244787 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.244831 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.244905 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.244867484 +0000 UTC m=+86.349897373 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.244936 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245020 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245026 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.245002748 +0000 UTC m=+86.350032627 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.244926 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245119 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.245094131 +0000 UTC m=+86.350124010 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245147 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245173 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.245272 5033 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.245241716 +0000 UTC m=+86.350271605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.273007 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.273080 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.273103 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.273134 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.273159 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.345992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.346249 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.346278 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.346296 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.346378 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.34635157 +0000 UTC m=+86.451381459 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.377106 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.377170 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.377189 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.377214 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.377235 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.481549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.481618 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.481682 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.481708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.481725 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.585351 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.585413 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.585430 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.585493 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.585522 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.620212 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.620483 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.620637 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.620745 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.620878 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:54 crc kubenswrapper[5033]: E0319 18:57:54.621285 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.627393 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.628886 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.631249 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.633857 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.635703 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.637088 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.638578 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.641643 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.643967 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.646677 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.647924 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.650548 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.651657 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.652892 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.654815 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.656053 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.658148 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.659004 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.660202 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.662837 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.664042 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.666172 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.667095 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.669416 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.670256 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.671678 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.674605 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.675952 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.678501 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.679568 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.680720 5033 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.680965 5033 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.683916 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.685123 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.687239 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.689392 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.689550 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.689570 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.689595 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.689613 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.690324 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.691671 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.692807 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.694164 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.698107 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.699222 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.701358 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.702860 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.704283 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.705442 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.706702 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.708027 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.709732 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.711765 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.713282 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.714349 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.716750 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.718056 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.719972 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.792376 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.792435 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.792537 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.792568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.792590 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.895012 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.895075 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.895111 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.895141 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.895161 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.998197 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.998310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.998329 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.998353 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:54 crc kubenswrapper[5033]: I0319 18:57:54.998374 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:54Z","lastTransitionTime":"2026-03-19T18:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.100848 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.100918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.100938 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.100962 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.100979 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.203627 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.203729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.203753 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.203781 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.203799 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.307100 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.307159 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.307177 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.307204 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.307223 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.410198 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.410257 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.410270 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.410286 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.410298 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.513307 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.513366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.513383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.513405 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.513423 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.615819 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.615877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.615895 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.615919 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.615935 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.637900 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.638014 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:57:55 crc kubenswrapper[5033]: E0319 18:57:55.638402 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.719056 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.719116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.719134 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.719160 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.719177 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.822251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.822320 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.822343 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.822374 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.822394 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.925130 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.925589 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.925606 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.925629 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.925645 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:55Z","lastTransitionTime":"2026-03-19T18:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:55 crc kubenswrapper[5033]: I0319 18:57:55.979564 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:57:55 crc kubenswrapper[5033]: E0319 18:57:55.979701 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.028131 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.028196 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.028221 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.028279 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.028303 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.131834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.131896 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.131913 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.131937 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.131954 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.233842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.233880 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.233888 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.233901 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.233911 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.262621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.262713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.262746 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.262773 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262829 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:00.262799653 +0000 UTC m=+90.367829522 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262875 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262883 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262929 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262955 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.262970 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:56 crc kubenswrapper[5033]: 
E0319 18:57:56.262937 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:00.262921897 +0000 UTC m=+90.367951756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.263033 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:00.26301427 +0000 UTC m=+90.368044119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.263044 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:00.263038571 +0000 UTC m=+90.368068420 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.337014 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.337073 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.337091 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.337116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.337134 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.364074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.364280 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.364306 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.364330 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.364409 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:00.364387223 +0000 UTC m=+90.469417112 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.439894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.440008 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.440025 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.440049 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.440066 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.543370 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.543410 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.543426 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.543489 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.543513 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.620472 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.620556 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.620603 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.620874 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.620980 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:56 crc kubenswrapper[5033]: E0319 18:57:56.621252 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.645347 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.645385 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.645396 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.645414 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.645427 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.749701 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.749754 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.749768 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.749785 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.749798 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.852550 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.852623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.852641 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.852668 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.852687 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.955784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.955826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.955837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.955862 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:56 crc kubenswrapper[5033]: I0319 18:57:56.955875 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:56Z","lastTransitionTime":"2026-03-19T18:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.058759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.058800 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.058811 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.058827 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.058840 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.161326 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.161356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.161366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.161383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.161391 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.263728 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.263774 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.263787 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.263807 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.263823 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.366226 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.366262 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.366273 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.366287 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.366295 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.468129 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.468209 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.468228 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.468255 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.468273 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.570649 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.570696 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.570708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.570727 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.570744 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.672433 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.672489 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.672506 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.672526 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.672568 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.780579 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.780618 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.780627 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.780642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.780654 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.882174 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.882202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.882210 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.882222 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.882269 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.977251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.977308 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.977326 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.977354 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:57 crc kubenswrapper[5033]: I0319 18:57:57.977372 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:57Z","lastTransitionTime":"2026-03-19T18:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.006133 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.011326 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.011381 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.011402 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.011432 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.011482 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.022239 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.026386 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.026681 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.026755 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.027112 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.027184 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.038682 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.042830 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.042877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.042901 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.042930 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.042953 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.054868 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.059053 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.059141 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.059210 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.059235 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.059253 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.073949 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.074198 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.075947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.076028 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.076116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.076209 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.076246 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.178604 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.178656 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.178671 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.178694 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.178711 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.281306 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.281352 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.281363 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.281381 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.281393 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.383351 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.383395 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.383408 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.383425 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.383436 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.486177 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.486259 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.486326 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.486354 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.486371 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.588279 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.588312 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.588322 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.588336 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.588347 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.619870 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.619987 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.620055 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.619873 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.620238 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:58 crc kubenswrapper[5033]: E0319 18:57:58.620307 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.690933 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.690982 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.690994 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.691014 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.691028 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.793203 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.793242 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.793253 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.793269 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.793282 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.894917 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.894963 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.894975 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.894992 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.895004 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.996634 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.996695 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.996709 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.996724 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:58 crc kubenswrapper[5033]: I0319 18:57:58.996735 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:58Z","lastTransitionTime":"2026-03-19T18:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.098075 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.098129 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.098142 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.098161 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.098173 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.201180 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.201208 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.201216 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.201229 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.201258 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.303663 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.303716 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.303725 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.303738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.303764 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.405673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.405731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.405744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.405762 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.405777 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.507809 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.507881 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.507895 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.507912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.507925 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.610557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.610622 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.610632 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.610690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.610701 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.634634 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.713383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.713428 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.713437 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.713467 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.713481 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.816625 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.816697 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.816721 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.816752 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.816775 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.918765 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.918819 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.918831 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.918848 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:59 crc kubenswrapper[5033]: I0319 18:57:59.918865 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:59Z","lastTransitionTime":"2026-03-19T18:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.020400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.020423 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.020431 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.020442 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.020462 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.122441 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.122513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.122539 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.122560 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.122576 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.224500 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.224536 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.224547 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.224560 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.224569 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.309238 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.309377 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:08.309351564 +0000 UTC m=+98.414381423 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.309508 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.309538 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.309574 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.309692 5033 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.309695 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.309734 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:08.309724216 +0000 UTC m=+98.414754075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.309775 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:08.309754887 +0000 UTC m=+98.414784776 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.310130 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.310169 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.310186 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.310258 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:08.310239664 +0000 UTC m=+98.415269523 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.327754 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.327795 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.327810 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.327829 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.327844 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.411047 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.411272 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.411311 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.411336 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.411417 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:08.411389579 +0000 UTC m=+98.516419478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.430356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.430391 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.430399 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.430414 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.430423 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.532715 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.532781 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.532803 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.532830 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.532854 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.619363 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.619373 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.619507 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.619560 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.619617 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:00 crc kubenswrapper[5033]: E0319 18:58:00.619726 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.627943 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.634526 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.634563 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.634573 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.634588 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.634600 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.636401 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.643731 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.651382 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.659116 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.684774 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97f3559-db1e-40d8-b9b7-b66d8fc22313\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d912c91e96987e8aaa14cc154d18cc611ac81125fa1f0f808703471be367105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3557e7fc840231124d9d0ad59f938cc557d38ec0fe52fc8492edb6a11560f617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b947a1ec11e629f395cec8f721ae101fa8a63878714be2d5b060f93dfa34267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7941ed812a7b1b0a9fa3a2e84c9c55982c901493321db52723d9726281a03bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82be8f4b8b6cbb8538c1e850b0300b0e876f3268fea10d8fa59bbf7b7ea67790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.697879 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0718fe93-d69d-4b40-a912-533ed06ad37b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:36.215817 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:36.215961 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:36.216639 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2833053112/tls.crt::/tmp/serving-cert-2833053112/tls.key\\\\\\\"\\\\nI0319 18:57:36.535936 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:36.538276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:36.538294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:36.538317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:36.538322 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:36.542415 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 18:57:36.542444 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542461 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542465 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:36.542468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:36.542471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:36.542474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 18:57:36.542760 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 18:57:36.543381 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.708899 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.737236 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.737265 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.737272 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.737287 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.737298 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.839897 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.839942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.839961 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.839979 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.839991 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.942162 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.942239 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.942258 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.942286 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:00 crc kubenswrapper[5033]: I0319 18:58:00.942305 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:00Z","lastTransitionTime":"2026-03-19T18:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.044394 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.044444 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.044479 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.044497 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.044510 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.146220 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.146275 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.146292 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.146314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.146329 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.249545 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.249592 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.249604 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.249623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.249636 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.352438 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.352637 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.352657 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.352683 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.352702 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.455683 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.455741 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.455752 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.455770 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.455783 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.558280 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.558348 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.558369 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.558400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.558422 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.660674 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.660708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.660717 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.660730 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.660739 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.763698 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.763755 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.763772 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.763791 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.763807 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.866820 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.866868 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.866883 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.866907 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.866931 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.969588 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.969675 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.969698 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.969723 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:01 crc kubenswrapper[5033]: I0319 18:58:01.969741 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:01Z","lastTransitionTime":"2026-03-19T18:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.071840 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.071922 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.071945 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.071970 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.071986 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.174150 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.174199 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.174216 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.174238 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.174254 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.277418 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.277740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.277826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.277925 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.278058 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.380920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.380980 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.380999 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.381022 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.381038 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.484023 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.484096 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.484112 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.484140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.484163 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.586572 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.586662 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.586676 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.586694 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.586706 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.620364 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.620397 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.620386 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:02 crc kubenswrapper[5033]: E0319 18:58:02.620539 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:02 crc kubenswrapper[5033]: E0319 18:58:02.620950 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:02 crc kubenswrapper[5033]: E0319 18:58:02.621043 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.689154 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.689217 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.689236 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.689261 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.689278 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.791975 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.792040 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.792052 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.792069 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.792105 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.894132 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.894183 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.894195 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.894213 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.894224 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.997405 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.998196 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.998330 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.998482 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:02 crc kubenswrapper[5033]: I0319 18:58:02.998625 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:02Z","lastTransitionTime":"2026-03-19T18:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.101221 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.101496 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.101590 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.101688 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.101785 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.150344 5033 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.204699 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.204741 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.204752 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.204768 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.204778 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.307476 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.307542 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.307565 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.307597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.307620 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.409912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.409978 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.409997 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.410020 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.410036 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.511650 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.511729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.511754 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.511783 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.511805 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.614856 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.614897 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.614905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.614920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.614930 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.717852 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.717895 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.717904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.717918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.717926 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.820407 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.820493 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.820514 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.820541 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.820562 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.922737 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.922775 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.922793 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.922814 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:03 crc kubenswrapper[5033]: I0319 18:58:03.922831 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:03Z","lastTransitionTime":"2026-03-19T18:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.023986 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.024016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.024024 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.024036 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.024044 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.127233 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.127294 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.127313 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.127338 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.127355 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.229699 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.229741 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.229759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.229779 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.229795 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.331798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.331867 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.331891 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.331918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.331938 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.434634 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.434675 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.434685 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.434704 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.434714 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.536603 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.536658 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.536676 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.536698 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.536716 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.620102 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.620186 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:04 crc kubenswrapper[5033]: E0319 18:58:04.620214 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:04 crc kubenswrapper[5033]: E0319 18:58:04.620346 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.620652 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:04 crc kubenswrapper[5033]: E0319 18:58:04.620732 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.639204 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.639271 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.639295 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.639328 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.639352 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.742509 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.742557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.742571 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.742590 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.742606 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.845603 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.845656 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.845671 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.845692 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.845707 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.948063 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.948113 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.948140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.948160 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:04 crc kubenswrapper[5033]: I0319 18:58:04.948171 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:04Z","lastTransitionTime":"2026-03-19T18:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.009341 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9dd832847543617c53caa8127cae8d107b50eb2602a46ac193c7c9f9450cf51e"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.023304 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.044966 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.053615 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc 
kubenswrapper[5033]: I0319 18:58:05.053637 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.053646 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.053659 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.053668 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.066369 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.082624 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.097189 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.113974 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97f3559-db1e-40d8-b9b7-b66d8fc22313\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d912c91e96987e8aaa14cc154d18cc611ac81125fa1f0f808703471be367105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3557e7fc840231124d9d0ad59f938cc557d38ec0fe52fc8492edb6a11560f617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b947a1ec11e629f395cec8f721ae101fa8a63878714be2d5b060f93dfa34267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7941ed812a7b1b0a9fa3a2e84c9c55982c901493321db52723d9726281a03bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82be8f4b8b6cbb8538c1e850b0300b0e876f3268fea10d8fa59bbf7b7ea67790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.122995 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0718fe93-d69d-4b40-a912-533ed06ad37b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:36.215817 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:36.215961 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:36.216639 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2833053112/tls.crt::/tmp/serving-cert-2833053112/tls.key\\\\\\\"\\\\nI0319 18:57:36.535936 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:36.538276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:36.538294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:36.538317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:36.538322 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:36.542415 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 18:57:36.542444 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542461 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542465 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:36.542468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:36.542471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:36.542474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 18:57:36.542760 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 18:57:36.543381 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.130770 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dd832847543617c53caa8127cae8d107b50eb2602a46ac193c7c9f9450cf51e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.155814 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.155853 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.155867 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.155883 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.155893 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.257766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.257834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.257853 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.257877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.257895 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.359242 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.359281 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.359289 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.359301 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.359309 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.460942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.460982 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.460990 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.461005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.461014 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.563528 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.563568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.563580 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.563597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.563609 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.665795 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.665828 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.665835 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.665851 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.665859 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.767744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.767782 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.767792 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.767807 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.767817 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.869838 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.869900 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.869918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.869950 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.869967 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.971836 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.971904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.971920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.971947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:05 crc kubenswrapper[5033]: I0319 18:58:05.971963 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:05Z","lastTransitionTime":"2026-03-19T18:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.074396 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.074440 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.074480 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.074499 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.074511 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.176161 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.176193 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.176202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.176214 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.176222 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.279041 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.279072 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.279081 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.279093 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.279101 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.381873 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.381917 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.381926 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.381941 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.381950 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.483875 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.483908 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.483918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.483932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.483943 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.585484 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.585520 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.585531 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.585546 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.585557 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.619604 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.619758 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:06 crc kubenswrapper[5033]: E0319 18:58:06.619911 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.619967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:06 crc kubenswrapper[5033]: E0319 18:58:06.620201 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:06 crc kubenswrapper[5033]: E0319 18:58:06.620409 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.620616 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:58:06 crc kubenswrapper[5033]: E0319 18:58:06.620767 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.687844 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.687916 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.687929 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.687949 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.687959 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.790834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.790868 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.790879 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.790894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.790903 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.893169 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.893215 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.893223 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.893236 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.893243 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.994641 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.994671 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.994679 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.994691 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:06 crc kubenswrapper[5033]: I0319 18:58:06.994699 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:06Z","lastTransitionTime":"2026-03-19T18:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.097911 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.097944 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.097954 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.097969 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.097979 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.200321 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.200356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.200366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.200381 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.200389 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.303005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.303033 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.303041 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.303055 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.303063 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.405547 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.405585 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.405596 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.405611 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.405620 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.507277 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.507324 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.507335 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.507357 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.507372 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.609699 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.609737 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.609748 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.609765 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.609778 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.711798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.711877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.711892 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.711913 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.711928 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.815338 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.815403 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.815420 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.815445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.815493 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.918270 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.918319 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.918335 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.918357 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:07 crc kubenswrapper[5033]: I0319 18:58:07.918373 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:07Z","lastTransitionTime":"2026-03-19T18:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.020914 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.020964 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.020985 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.021010 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.021029 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.123967 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.124013 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.124030 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.124055 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.124071 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.226426 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.226491 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.226503 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.226521 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.226534 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.328098 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.328132 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.328142 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.328156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.328165 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.381222 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.381288 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.381309 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.381325 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381428 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:58:08 
crc kubenswrapper[5033]: E0319 18:58:08.381502 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:24.381479403 +0000 UTC m=+114.486509272 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381531 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:24.381520384 +0000 UTC m=+114.486550243 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381543 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381547 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381729 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381744 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381779 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:24.381767393 +0000 UTC m=+114.486797252 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.381804 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:24.381795363 +0000 UTC m=+114.486825222 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.396710 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.396783 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.396808 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.396840 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.396865 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.410583 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.414857 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.414936 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.414956 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.415011 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.415065 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.429287 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.432792 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.432841 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.432862 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.432890 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.432911 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.450738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.450781 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.450798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.450822 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.450837 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.465493 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.468812 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.468881 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.468906 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.468936 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.468959 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.478994 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0cdcec62-f1e9-46b5-ad2f-f7ca42c1ed21\\\",\\\"systemUUID\\\":\\\"4c45382c-f0a8-4377-81e5-4e3ff4799b14\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.479206 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.480752 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.480811 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.480824 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.480857 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.480868 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.482024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.482165 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.482188 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.482199 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.482245 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:24.482230534 +0000 UTC m=+114.587260383 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.583594 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.583642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.583658 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.583680 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.583698 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.620379 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.620427 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.620570 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.620651 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.620755 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:08 crc kubenswrapper[5033]: E0319 18:58:08.620870 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.686608 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.686886 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.686896 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.686912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.686921 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.789980 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.790019 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.790029 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.790047 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.790081 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.892997 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.893060 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.893080 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.893108 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.893127 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.995699 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.995739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.995749 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.995766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:08 crc kubenswrapper[5033]: I0319 18:58:08.995778 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:08Z","lastTransitionTime":"2026-03-19T18:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.022283 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"63fa44fd7da587a9b863a13e665d0a2d0c6974c6d1e07779f6fc11480854724d"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.024564 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d26f272a5f7e87332ff9b4c65c2a9a1f36770a3a16d1d2e9591305595f04e109"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.024597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"40eb826a4c2ef192b937fc6d29c453b282d74e5ad05823b2c7e56d8cbd43ed2c"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.039958 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.053099 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.068549 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.086974 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97f3559-db1e-40d8-b9b7-b66d8fc22313\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d912c91e96987e8aaa14cc154d18cc611ac81125fa1f0f808703471be367105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3557e7fc840231124d9d0ad59f938cc557d38ec0fe52fc8492edb6a11560f617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b947a1ec11e629f395cec8f721ae101fa8a63878714be2d5b060f93dfa34267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7941ed812a7b1b0a9fa3a2e84c9c55982c901493321db52723d9726281a03bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82be8f4b8b6cbb8538c1e850b0300b0e876f3268fea10d8fa59bbf7b7ea67790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.097836 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.097891 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.097905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.097928 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.097944 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.105831 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0718fe93-d69d-4b40-a912-533ed06ad37b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:36.215817 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:36.215961 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:36.216639 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2833053112/tls.crt::/tmp/serving-cert-2833053112/tls.key\\\\\\\"\\\\nI0319 18:57:36.535936 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:36.538276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:36.538294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:36.538317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:36.538322 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:36.542415 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 18:57:36.542444 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542461 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542465 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:36.542468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:36.542471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:36.542474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 18:57:36.542760 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 18:57:36.543381 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.120471 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dd832847543617c53caa8127cae8d107b50eb2602a46ac193c7c9f9450cf51e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.131571 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.142681 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fa44fd7da587a9b863a13e665d0a2d0c6974c6d1e07779f6fc11480854724d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 
18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.172043 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.200114 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.200155 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.200166 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.200184 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.200197 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.201563 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97f3559-db1e-40d8-b9b7-b66d8fc22313\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d912c91e96987e8aaa14cc154d18cc611ac81125fa1f0f808703471be367105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3557e7fc840231124d9d0ad59f938cc557d38ec0fe52fc8492edb6a11560f617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b947a1ec11e629f395cec8f721ae101fa8a63878714be2d5b060f93dfa34267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7941ed812a7b1b0a9fa3a2e84c9c55982c901493321db52723d9726281a03bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82be8f4b8b6cbb8538c1e850b0300b0e876f3268fea10d8fa59bbf7b7ea67790\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3932a4f2039c85f8a9f7e6c0dff265dcede03d6b8d777e0c456aef930bee095\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e75b82fb65d68062d62039b4c888634160577b16ed367d44683928ea1a68469f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92e2de63ba0f624d8bc4a8f6a6591f279d98766f82737ea59fe34c8bcc5ef9f0\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.211304 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0718fe93-d69d-4b40-a912-533ed06ad37b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:36.215817 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:36.215961 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:36.216639 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2833053112/tls.crt::/tmp/serving-cert-2833053112/tls.key\\\\\\\"\\\\nI0319 18:57:36.535936 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:36.538276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:36.538294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:36.538317 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:36.538322 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:36.542415 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0319 18:57:36.542444 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542461 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:36.542465 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:36.542468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:36.542471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:36.542474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0319 18:57:36.542760 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0319 18:57:36.543381 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.221405 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dd832847543617c53caa8127cae8d107b50eb2602a46ac193c7c9f9450cf51e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.229698 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.237180 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.244920 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:58:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d26f272a5f7e87332ff9b4c65c2a9a1f36770a3a16d1d2e9591305595f04e109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40eb826a4c2ef192b937fc6d29c453b282d74e5ad05823b2c7e56d8cbd43ed2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.302079 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.302116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.302126 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.302141 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.302151 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.405142 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.405198 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.405218 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.405241 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.405259 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.507979 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.508018 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.508027 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.508042 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.508052 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.611048 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.611120 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.611138 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.611164 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.611182 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.714192 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.714269 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.714298 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.714329 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.714351 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.817282 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.817334 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.817351 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.817375 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.817396 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.920734 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.920800 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.920824 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.920850 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:09 crc kubenswrapper[5033]: I0319 18:58:09.920877 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:09Z","lastTransitionTime":"2026-03-19T18:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.023256 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.023374 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.023401 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.023496 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.023524 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.126267 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.126340 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.126374 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.126405 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.126425 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.229247 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.229307 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.229324 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.229348 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.229366 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.332631 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.332689 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.332711 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.332740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.332761 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.414487 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jl8cj"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.415927 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.418523 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.418949 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.418993 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.429142 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-779xw"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.429636 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.430233 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-t2488"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.431280 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.433182 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d5qfn"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.433799 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.434932 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.434959 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.434948 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.435274 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.435284 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.435308 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.435339 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.435351 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.436196 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.437111 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.437839 5033 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438442 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438527 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438546 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438571 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438588 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.438658 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.461273 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bk4w2"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.462513 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.464083 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.464168 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.464570 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.464832 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.465218 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.465791 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.467352 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.497805 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=11.497781052 podStartE2EDuration="11.497781052s" podCreationTimestamp="2026-03-19 18:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:10.497368798 +0000 UTC m=+100.602398647" watchObservedRunningTime="2026-03-19 18:58:10.497781052 +0000 UTC m=+100.602810911" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499694 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499756 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-k8s-cni-cncf-io\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-socket-dir-parent\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjm52\" (UniqueName: \"kubernetes.io/projected/c960a9d1-3c99-4e77-9906-e319e0aed817-kube-api-access-wjm52\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499811 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-conf-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499876 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cthcm\" (UniqueName: \"kubernetes.io/projected/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-kube-api-access-cthcm\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499915 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.499994 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500033 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: 
\"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cni-binary-copy\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500097 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-kubelet\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500161 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-system-cni-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500240 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-etc-kubernetes\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500262 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500284 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c960a9d1-3c99-4e77-9906-e319e0aed817-proxy-tls\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500317 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-os-release\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500354 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500375 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e0e9791-5b4d-47a1-a475-888179c53064-hosts-file\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd\") pod \"ovnkube-node-bk4w2\" (UID: 
\"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500422 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-bin\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500443 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500485 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c960a9d1-3c99-4e77-9906-e319e0aed817-rootfs\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500511 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqndh\" (UniqueName: \"kubernetes.io/projected/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-kube-api-access-zqndh\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500548 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500595 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500635 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-cni-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500656 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-daemon-config\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500698 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cnibin\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500729 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500755 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500800 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500820 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-system-cni-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500861 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cnibin\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500882 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-os-release\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500910 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-netns\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500929 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-multus\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500971 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmnzc\" (UniqueName: \"kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.500997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c960a9d1-3c99-4e77-9906-e319e0aed817-mcd-auth-proxy-config\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: 
I0319 18:58:10.501017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-hostroot\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.501036 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-multus-certs\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.501054 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.501074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnq2\" (UniqueName: \"kubernetes.io/projected/1e0e9791-5b4d-47a1-a475-888179c53064-kube-api-access-zcnq2\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.501105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.501125 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.540685 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.540712 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.540721 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.540735 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.540745 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602369 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-kubelet\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cni-binary-copy\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602495 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-system-cni-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602517 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-etc-kubernetes\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602547 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-kubelet\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602568 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-os-release\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602593 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e0e9791-5b4d-47a1-a475-888179c53064-hosts-file\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602652 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c960a9d1-3c99-4e77-9906-e319e0aed817-proxy-tls\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602647 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-etc-kubernetes\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602674 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-system-cni-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602689 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-os-release\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602722 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602724 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-bin\") pod \"multus-d5qfn\" (UID: 
\"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602753 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-bin\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c960a9d1-3c99-4e77-9906-e319e0aed817-rootfs\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e0e9791-5b4d-47a1-a475-888179c53064-hosts-file\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c960a9d1-3c99-4e77-9906-e319e0aed817-rootfs\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc 
kubenswrapper[5033]: I0319 18:58:10.602806 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602810 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602840 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602950 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602781 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603003 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqndh\" (UniqueName: \"kubernetes.io/projected/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-kube-api-access-zqndh\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-cni-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603054 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-daemon-config\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603076 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cnibin\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603103 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603172 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603202 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603224 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-system-cni-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-os-release\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-netns\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603289 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-multus\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603309 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603330 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmnzc\" (UniqueName: \"kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603339 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603351 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cnibin\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603378 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-hostroot\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603396 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cnibin\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603402 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-multus-certs\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-netns\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.602997 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603427 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603436 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603499 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-var-lib-cni-multus\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603500 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603498 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-system-cni-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc 
kubenswrapper[5033]: I0319 18:58:10.603527 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603557 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-hostroot\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnq2\" (UniqueName: \"kubernetes.io/projected/1e0e9791-5b4d-47a1-a475-888179c53064-kube-api-access-zcnq2\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-multus-certs\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603596 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c960a9d1-3c99-4e77-9906-e319e0aed817-mcd-auth-proxy-config\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603865 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-os-release\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603901 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603902 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-daemon-config\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603926 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cnibin\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603965 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.603996 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-cni-dir\") pod 
\"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604012 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604025 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604121 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604083 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-cni-binary-copy\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604165 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " 
pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604199 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-k8s-cni-cncf-io\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604250 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604273 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-host-run-k8s-cni-cncf-io\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-socket-dir-parent\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604341 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjm52\" (UniqueName: \"kubernetes.io/projected/c960a9d1-3c99-4e77-9906-e319e0aed817-kube-api-access-wjm52\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604383 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-socket-dir-parent\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604395 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604409 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-conf-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604464 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-multus-conf-dir\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604475 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cthcm\" (UniqueName: \"kubernetes.io/projected/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-kube-api-access-cthcm\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604538 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604654 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604655 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604673 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604738 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c960a9d1-3c99-4e77-9906-e319e0aed817-mcd-auth-proxy-config\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.604810 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-binary-copy\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.605007 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.605432 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.608162 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.609424 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c960a9d1-3c99-4e77-9906-e319e0aed817-proxy-tls\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.619550 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.619626 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.619681 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:10 crc kubenswrapper[5033]: E0319 18:58:10.619853 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:10 crc kubenswrapper[5033]: E0319 18:58:10.619950 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:10 crc kubenswrapper[5033]: E0319 18:58:10.620029 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.627220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmnzc\" (UniqueName: \"kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc\") pod \"ovnkube-node-bk4w2\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.627590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cthcm\" (UniqueName: \"kubernetes.io/projected/4a7b8904-0121-4d6c-849e-1ebfa3af0c61-kube-api-access-cthcm\") pod \"multus-d5qfn\" (UID: \"4a7b8904-0121-4d6c-849e-1ebfa3af0c61\") " pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.628293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqndh\" 
(UniqueName: \"kubernetes.io/projected/ff0b40a7-0292-4c40-a9de-a5a1e52062c1-kube-api-access-zqndh\") pod \"multus-additional-cni-plugins-t2488\" (UID: \"ff0b40a7-0292-4c40-a9de-a5a1e52062c1\") " pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.630913 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnq2\" (UniqueName: \"kubernetes.io/projected/1e0e9791-5b4d-47a1-a475-888179c53064-kube-api-access-zcnq2\") pod \"node-resolver-jl8cj\" (UID: \"1e0e9791-5b4d-47a1-a475-888179c53064\") " pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.634990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjm52\" (UniqueName: \"kubernetes.io/projected/c960a9d1-3c99-4e77-9906-e319e0aed817-kube-api-access-wjm52\") pod \"machine-config-daemon-779xw\" (UID: \"c960a9d1-3c99-4e77-9906-e319e0aed817\") " pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.646970 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647021 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647033 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647127 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647150 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647467 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2lqp4"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.647844 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.649685 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.649776 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.650600 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.650758 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.706075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpgrd\" (UniqueName: \"kubernetes.io/projected/89af5617-3b48-45aa-a561-5181a74508d3-kube-api-access-zpgrd\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.706133 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89af5617-3b48-45aa-a561-5181a74508d3-serviceca\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.706156 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89af5617-3b48-45aa-a561-5181a74508d3-host\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.742230 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jl8cj" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.749674 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.749716 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.749725 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.749739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.749748 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: W0319 18:58:10.752037 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e0e9791_5b4d_47a1_a475_888179c53064.slice/crio-3c32a8eabff55d94d438f914a6eedb3b5cec7e87e101cf4a3fedfb87000fcb85 WatchSource:0}: Error finding container 3c32a8eabff55d94d438f914a6eedb3b5cec7e87e101cf4a3fedfb87000fcb85: Status 404 returned error can't find the container with id 3c32a8eabff55d94d438f914a6eedb3b5cec7e87e101cf4a3fedfb87000fcb85 Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.757790 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.776033 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t2488" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.788033 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5qfn" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.789317 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.789815 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.791421 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.795685 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.797811 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.808479 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpgrd\" (UniqueName: \"kubernetes.io/projected/89af5617-3b48-45aa-a561-5181a74508d3-kube-api-access-zpgrd\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.808573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89af5617-3b48-45aa-a561-5181a74508d3-serviceca\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.808604 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89af5617-3b48-45aa-a561-5181a74508d3-host\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.808655 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/89af5617-3b48-45aa-a561-5181a74508d3-host\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.810155 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89af5617-3b48-45aa-a561-5181a74508d3-serviceca\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.814061 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kcmn4"] Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.814589 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:10 crc kubenswrapper[5033]: E0319 18:58:10.814647 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:10 crc kubenswrapper[5033]: W0319 18:58:10.824776 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7b8904_0121_4d6c_849e_1ebfa3af0c61.slice/crio-13c0662fbd54de5087f6eb9dd78c9fdb99531ef47b342934a65642f0e9ade40c WatchSource:0}: Error finding container 13c0662fbd54de5087f6eb9dd78c9fdb99531ef47b342934a65642f0e9ade40c: Status 404 returned error can't find the container with id 13c0662fbd54de5087f6eb9dd78c9fdb99531ef47b342934a65642f0e9ade40c Mar 19 18:58:10 crc kubenswrapper[5033]: W0319 18:58:10.825564 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb7906f1_92ba_45a7_9a54_82a77c8e3e66.slice/crio-ec319f58f8bcf43b271ba1d4458d441fa369e94ac625dd5efb142e3011d67080 WatchSource:0}: Error finding container ec319f58f8bcf43b271ba1d4458d441fa369e94ac625dd5efb142e3011d67080: Status 404 returned error can't find the container with id ec319f58f8bcf43b271ba1d4458d441fa369e94ac625dd5efb142e3011d67080 Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.826823 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpgrd\" (UniqueName: \"kubernetes.io/projected/89af5617-3b48-45aa-a561-5181a74508d3-kube-api-access-zpgrd\") pod \"node-ca-2lqp4\" (UID: \"89af5617-3b48-45aa-a561-5181a74508d3\") " pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.852652 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.852696 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.852708 5033 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.852728 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.852744 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.909959 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38915bd9-5b98-41a1-8335-399b06148c05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.910031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.910070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: 
\"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.910101 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khp4j\" (UniqueName: \"kubernetes.io/projected/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-kube-api-access-khp4j\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.910145 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.910193 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krpg\" (UniqueName: \"kubernetes.io/projected/38915bd9-5b98-41a1-8335-399b06148c05-kube-api-access-9krpg\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.955581 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.955644 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.955660 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.955682 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.955729 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:10Z","lastTransitionTime":"2026-03-19T18:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:10 crc kubenswrapper[5033]: I0319 18:58:10.965766 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2lqp4" Mar 19 18:58:10 crc kubenswrapper[5033]: W0319 18:58:10.985703 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89af5617_3b48_45aa_a561_5181a74508d3.slice/crio-7fc6319d90bf959b3f4cf3a011a2a081dab14d28e31df0a4eab3861d02b1936f WatchSource:0}: Error finding container 7fc6319d90bf959b3f4cf3a011a2a081dab14d28e31df0a4eab3861d02b1936f: Status 404 returned error can't find the container with id 7fc6319d90bf959b3f4cf3a011a2a081dab14d28e31df0a4eab3861d02b1936f Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011739 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krpg\" (UniqueName: \"kubernetes.io/projected/38915bd9-5b98-41a1-8335-399b06148c05-kube-api-access-9krpg\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011825 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/38915bd9-5b98-41a1-8335-399b06148c05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011887 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khp4j\" (UniqueName: \"kubernetes.io/projected/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-kube-api-access-khp4j\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.011947 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:11 crc kubenswrapper[5033]: E0319 18:58:11.012116 5033 secret.go:188] Couldn't 
get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:11 crc kubenswrapper[5033]: E0319 18:58:11.012188 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs podName:5120920c-fe7c-454a-9dd5-9c0b79e0fb04 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.512167471 +0000 UTC m=+101.617197330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs") pod "network-metrics-daemon-kcmn4" (UID: "5120920c-fe7c-454a-9dd5-9c0b79e0fb04") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.012721 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.013163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38915bd9-5b98-41a1-8335-399b06148c05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.014651 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38915bd9-5b98-41a1-8335-399b06148c05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.030348 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9krpg\" (UniqueName: \"kubernetes.io/projected/38915bd9-5b98-41a1-8335-399b06148c05-kube-api-access-9krpg\") pod \"ovnkube-control-plane-749d76644c-bfqwc\" (UID: \"38915bd9-5b98-41a1-8335-399b06148c05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.032678 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khp4j\" (UniqueName: \"kubernetes.io/projected/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-kube-api-access-khp4j\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.035333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl8cj" event={"ID":"1e0e9791-5b4d-47a1-a475-888179c53064","Type":"ContainerStarted","Data":"3c32a8eabff55d94d438f914a6eedb3b5cec7e87e101cf4a3fedfb87000fcb85"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.037868 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085" exitCode=0 Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.037938 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.037969 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" 
event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"ec319f58f8bcf43b271ba1d4458d441fa369e94ac625dd5efb142e3011d67080"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.040301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5qfn" event={"ID":"4a7b8904-0121-4d6c-849e-1ebfa3af0c61","Type":"ContainerStarted","Data":"e905b7626a19de4e8ef2773614c8f4558ce6031d89279d6de3f9c845f392f98f"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.040337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5qfn" event={"ID":"4a7b8904-0121-4d6c-849e-1ebfa3af0c61","Type":"ContainerStarted","Data":"13c0662fbd54de5087f6eb9dd78c9fdb99531ef47b342934a65642f0e9ade40c"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.041539 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerStarted","Data":"fe58f03d7930ce07ee027d1bbea6549026a96aa0eeac11f21339b82a548aa164"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.043854 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.043900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"000071a6640d5ee2c5c51e6dead3a3635af74ba3e6e03808044cf6ef8b873848"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.044868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2lqp4" 
event={"ID":"89af5617-3b48-45aa-a561-5181a74508d3","Type":"ContainerStarted","Data":"7fc6319d90bf959b3f4cf3a011a2a081dab14d28e31df0a4eab3861d02b1936f"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.058049 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.058090 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.058101 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.058116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.058125 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.091042 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d5qfn" podStartSLOduration=37.091025649 podStartE2EDuration="37.091025649s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:11.090993078 +0000 UTC m=+101.196022927" watchObservedRunningTime="2026-03-19 18:58:11.091025649 +0000 UTC m=+101.196055498" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.108051 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" Mar 19 18:58:11 crc kubenswrapper[5033]: W0319 18:58:11.122548 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38915bd9_5b98_41a1_8335_399b06148c05.slice/crio-cd8cb8d8cac4bc85653f5e7718b3ef5e38550507a0ad01f062a1cd882dd28a67 WatchSource:0}: Error finding container cd8cb8d8cac4bc85653f5e7718b3ef5e38550507a0ad01f062a1cd882dd28a67: Status 404 returned error can't find the container with id cd8cb8d8cac4bc85653f5e7718b3ef5e38550507a0ad01f062a1cd882dd28a67 Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.160267 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.160630 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.160642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.160660 5033 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.160674 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.298692 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.298743 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.298755 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.298777 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.298794 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.401131 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.401174 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.401185 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.401203 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.401216 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.504196 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.504249 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.504259 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.504276 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.504286 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.515747 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:11 crc kubenswrapper[5033]: E0319 18:58:11.515864 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:11 crc kubenswrapper[5033]: E0319 18:58:11.515911 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs podName:5120920c-fe7c-454a-9dd5-9c0b79e0fb04 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.515897317 +0000 UTC m=+102.620927166 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs") pod "network-metrics-daemon-kcmn4" (UID: "5120920c-fe7c-454a-9dd5-9c0b79e0fb04") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.606541 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.606576 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.606585 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.606599 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.606609 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.708605 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.708639 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.708648 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.708688 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.708698 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.811256 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.811287 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.811295 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.811309 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.811320 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.914568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.914838 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.914850 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.914869 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:11 crc kubenswrapper[5033]: I0319 18:58:11.914880 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:11Z","lastTransitionTime":"2026-03-19T18:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.018278 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.018307 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.018314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.018329 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.018339 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.049228 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl8cj" event={"ID":"1e0e9791-5b4d-47a1-a475-888179c53064","Type":"ContainerStarted","Data":"2f2961ad81d10f6cf8bfa72486e51361d24d147d2561a889a0100b3786c9e39c"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.050488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2lqp4" event={"ID":"89af5617-3b48-45aa-a561-5181a74508d3","Type":"ContainerStarted","Data":"b9139ac59b6eb1392cdf26565617cc44c99d2bb8fc1863c49444ba31acf838d4"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.052125 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="22f7a337087c8d6f428f36e2ffb73699d90ba8c3e2792253e203de4d1f429866" exitCode=0 Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.052207 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"22f7a337087c8d6f428f36e2ffb73699d90ba8c3e2792253e203de4d1f429866"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.054052 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"12baba7615be1225a0be44bf6b4ca051deb00fd0c03cc04628c9870422dddc15"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.055698 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" event={"ID":"38915bd9-5b98-41a1-8335-399b06148c05","Type":"ContainerStarted","Data":"2785a8462ea18fe18dd5d565b2b1209a3a1c8ece2d0136cf791b100ab09b3fd3"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.055736 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" event={"ID":"38915bd9-5b98-41a1-8335-399b06148c05","Type":"ContainerStarted","Data":"e5ca7f159446115c7568ab4529bdaee8b016a381ae7a1e09aeaf746f2df0d24d"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.055750 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" event={"ID":"38915bd9-5b98-41a1-8335-399b06148c05","Type":"ContainerStarted","Data":"cd8cb8d8cac4bc85653f5e7718b3ef5e38550507a0ad01f062a1cd882dd28a67"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059431 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059477 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.059486 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.084619 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jl8cj" podStartSLOduration=38.084600951 podStartE2EDuration="38.084600951s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.066169425 +0000 UTC m=+102.171199274" watchObservedRunningTime="2026-03-19 18:58:12.084600951 +0000 UTC m=+102.189630800" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.094843 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podStartSLOduration=38.094822828 podStartE2EDuration="38.094822828s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.084398794 +0000 UTC m=+102.189428673" watchObservedRunningTime="2026-03-19 18:58:12.094822828 +0000 UTC m=+102.199852687" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.095347 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2lqp4" podStartSLOduration=38.095342205 podStartE2EDuration="38.095342205s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.094986153 +0000 UTC m=+102.200016002" watchObservedRunningTime="2026-03-19 18:58:12.095342205 +0000 UTC m=+102.200372074" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.120274 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.120312 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.120324 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.120343 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.120354 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.139154 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bfqwc" podStartSLOduration=37.139136502 podStartE2EDuration="37.139136502s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.138493831 +0000 UTC m=+102.243523690" watchObservedRunningTime="2026-03-19 18:58:12.139136502 +0000 UTC m=+102.244166361" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.222370 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.222400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.222408 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.222421 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.222429 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.325269 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.325640 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.325655 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.325673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.325684 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.428275 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.428313 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.428322 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.428336 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.428346 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.523049 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.523178 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.523237 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs podName:5120920c-fe7c-454a-9dd5-9c0b79e0fb04 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.523219685 +0000 UTC m=+104.628249534 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs") pod "network-metrics-daemon-kcmn4" (UID: "5120920c-fe7c-454a-9dd5-9c0b79e0fb04") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.530849 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.530896 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.530912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.530936 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.530954 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.620071 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.620080 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.620095 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.620423 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.620232 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.620098 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.620505 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:12 crc kubenswrapper[5033]: E0319 18:58:12.620552 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.633186 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.633233 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.633242 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.633256 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.633265 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.735742 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.735772 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.735782 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.735797 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.735808 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.837613 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.837662 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.837679 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.837697 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.837721 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.939493 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.939537 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.939549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.939568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:12 crc kubenswrapper[5033]: I0319 18:58:12.939580 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:12Z","lastTransitionTime":"2026-03-19T18:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.042303 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.042347 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.042355 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.042369 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.042378 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.064128 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="beb58ae2a0d60dbfcd489f7a82cb293a834a1cb12e4578c9fc4416bedf8bcc83" exitCode=0 Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.064257 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"beb58ae2a0d60dbfcd489f7a82cb293a834a1cb12e4578c9fc4416bedf8bcc83"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.145080 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.145402 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.145412 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.145424 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.145433 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.247310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.247344 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.247352 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.247366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.247374 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.351359 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.351385 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.351393 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.351406 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.351413 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.453429 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.453472 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.453480 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.453493 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.453500 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.555482 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.555510 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.555518 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.555530 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.555538 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.657714 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.657758 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.657769 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.657785 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.657794 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.760835 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.760895 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.760913 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.760937 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.760955 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.863704 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.864011 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.864336 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.865584 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.865608 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.967820 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.967912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.967931 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.967954 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:13 crc kubenswrapper[5033]: I0319 18:58:13.967970 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:13Z","lastTransitionTime":"2026-03-19T18:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.069814 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.069914 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.069975 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.070005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.070070 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.073287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.075959 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="b760f6709626164bab49bdc8102d431d1d5f5e3d0de5b0ec8b78da1465027ed7" exitCode=0 Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.076006 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"b760f6709626164bab49bdc8102d431d1d5f5e3d0de5b0ec8b78da1465027ed7"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.172083 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.172127 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.172140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.172158 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.172171 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.274698 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.275047 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.275084 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.275118 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.275140 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.379169 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.379400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.379409 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.379448 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.379469 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.483423 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.483484 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.483496 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.483513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.483524 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.544733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.544847 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.544895 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs podName:5120920c-fe7c-454a-9dd5-9c0b79e0fb04 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.544882641 +0000 UTC m=+108.649912490 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs") pod "network-metrics-daemon-kcmn4" (UID: "5120920c-fe7c-454a-9dd5-9c0b79e0fb04") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.585465 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.585500 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.585513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.585528 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.585538 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.620197 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.620224 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.620314 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.620311 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.620197 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.620414 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.620492 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:14 crc kubenswrapper[5033]: E0319 18:58:14.620546 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.690594 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.690623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.690630 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.690643 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.690652 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.793423 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.793495 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.793508 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.793524 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.793537 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.895685 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.895723 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.895739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.895757 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:14 crc kubenswrapper[5033]: I0319 18:58:14.895769 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:14Z","lastTransitionTime":"2026-03-19T18:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.000959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.001003 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.001014 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.001033 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.001047 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.083832 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="056e1eb8c8b618d8f9603d02ee4f5add68e4b0f30acf6fda1d0c2c6fea2a7ad8" exitCode=0 Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.083903 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"056e1eb8c8b618d8f9603d02ee4f5add68e4b0f30acf6fda1d0c2c6fea2a7ad8"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.106365 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.106414 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.106427 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.106445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.106488 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.208991 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.209038 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.209053 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.209074 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.209091 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.312070 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.312100 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.312109 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.312123 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.312132 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.415901 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.415932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.415940 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.415973 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.415982 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.518867 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.518918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.518934 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.518959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.518977 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.621517 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.621576 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.621593 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.621615 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.621631 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.723690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.723731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.723742 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.723759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.723772 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.827345 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.827394 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.827422 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.827445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.827496 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.930499 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.930563 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.930582 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.930606 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:15 crc kubenswrapper[5033]: I0319 18:58:15.930625 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:15Z","lastTransitionTime":"2026-03-19T18:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.034849 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.035193 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.035212 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.035237 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.035256 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.092570 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="1a319ffeee1011c09fb694dc783488dc7ebf3ea5cc5b30866076ab1ca9c08704" exitCode=0 Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.092634 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"1a319ffeee1011c09fb694dc783488dc7ebf3ea5cc5b30866076ab1ca9c08704"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.138486 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.138533 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.138545 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.138563 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.138575 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.241775 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.241821 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.241834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.241854 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.241869 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.344401 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.344516 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.344529 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.344548 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.344559 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.446614 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.446639 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.446675 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.446696 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.446707 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.550116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.550150 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.550159 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.550173 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.550183 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.620732 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:16 crc kubenswrapper[5033]: E0319 18:58:16.620908 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.621010 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.621144 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:16 crc kubenswrapper[5033]: E0319 18:58:16.621188 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.621244 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:16 crc kubenswrapper[5033]: E0319 18:58:16.621361 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:16 crc kubenswrapper[5033]: E0319 18:58:16.621600 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.653918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.653949 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.653959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.653975 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.653984 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.756847 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.756878 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.756886 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.756899 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.756907 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.859222 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.859260 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.859272 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.859288 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.859300 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.961271 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.961366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.961389 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.961417 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:16 crc kubenswrapper[5033]: I0319 18:58:16.961483 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:16Z","lastTransitionTime":"2026-03-19T18:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.069786 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.069950 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.069974 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.069996 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.070016 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.102922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerStarted","Data":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.103447 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.103628 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.103770 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.109753 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff0b40a7-0292-4c40-a9de-a5a1e52062c1" containerID="537d666c23be774038c51aa84822addbe67ec9eff5b828fafad7b09424d9ff72" exitCode=0 Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.109794 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerDied","Data":"537d666c23be774038c51aa84822addbe67ec9eff5b828fafad7b09424d9ff72"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.144496 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podStartSLOduration=43.144477202 podStartE2EDuration="43.144477202s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.143949865 +0000 UTC m=+107.248979734" 
watchObservedRunningTime="2026-03-19 18:58:17.144477202 +0000 UTC m=+107.249507061" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.166505 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.171194 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.172770 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.172855 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.172913 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.172972 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.173028 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.274852 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.274894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.274905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.274924 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.274936 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.378234 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.378279 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.378288 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.378306 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.378316 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.480865 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.480922 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.480941 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.480966 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.480985 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.584445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.584577 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.584588 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.584609 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.584625 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.687250 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.687279 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.687288 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.687300 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.687310 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.790332 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.790615 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.790701 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.790826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.790925 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.892983 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.893044 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.893061 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.893091 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.893109 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.995604 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.995661 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.995678 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.995706 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:17 crc kubenswrapper[5033]: I0319 18:58:17.995723 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:17Z","lastTransitionTime":"2026-03-19T18:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.098045 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.098098 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.098115 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.098138 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.098154 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.119379 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t2488" event={"ID":"ff0b40a7-0292-4c40-a9de-a5a1e52062c1","Type":"ContainerStarted","Data":"62de7645e378c01427f696171835c3eff1f5a53214da4f8b21223caa510922d6"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.159759 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t2488" podStartSLOduration=44.159735291 podStartE2EDuration="44.159735291s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:18.159268545 +0000 UTC m=+108.264298394" watchObservedRunningTime="2026-03-19 18:58:18.159735291 +0000 UTC m=+108.264765170" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.201077 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.201148 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.201171 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.201203 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.201224 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.305138 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.305387 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.305485 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.305561 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.305639 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.408016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.408058 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.408069 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.408086 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.408097 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.511118 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.511153 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.511165 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.511181 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.511192 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.587837 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.588238 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.588368 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs podName:5120920c-fe7c-454a-9dd5-9c0b79e0fb04 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:26.588351527 +0000 UTC m=+116.693381376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs") pod "network-metrics-daemon-kcmn4" (UID: "5120920c-fe7c-454a-9dd5-9c0b79e0fb04") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.614525 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.614607 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.614632 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.614674 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.614697 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.619823 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.619862 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.619939 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.620077 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.620220 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.620299 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.620480 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:18 crc kubenswrapper[5033]: E0319 18:58:18.620668 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.717712 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.717921 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.718039 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.718121 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.718178 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.756523 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kcmn4"] Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.821068 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.821106 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.821117 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.821135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.821146 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.869544 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.869586 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.869599 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.869617 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.869629 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:58:18Z","lastTransitionTime":"2026-03-19T18:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.908960 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w"] Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.909281 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.911949 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.912133 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.912276 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.912632 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.991129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.991177 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6927c5b0-a30e-4447-b6f0-08cb27a176d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.991201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.991226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6927c5b0-a30e-4447-b6f0-08cb27a176d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:18 crc kubenswrapper[5033]: I0319 18:58:18.991265 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6927c5b0-a30e-4447-b6f0-08cb27a176d7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092109 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6927c5b0-a30e-4447-b6f0-08cb27a176d7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092208 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6927c5b0-a30e-4447-b6f0-08cb27a176d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092226 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6927c5b0-a30e-4447-b6f0-08cb27a176d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092295 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.092360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/6927c5b0-a30e-4447-b6f0-08cb27a176d7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.093410 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6927c5b0-a30e-4447-b6f0-08cb27a176d7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.098490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6927c5b0-a30e-4447-b6f0-08cb27a176d7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.118199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6927c5b0-a30e-4447-b6f0-08cb27a176d7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wxw6w\" (UID: \"6927c5b0-a30e-4447-b6f0-08cb27a176d7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.122615 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:19 crc kubenswrapper[5033]: E0319 18:58:19.122770 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.224221 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.607057 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 18:58:19 crc kubenswrapper[5033]: I0319 18:58:19.616896 5033 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.127654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" event={"ID":"6927c5b0-a30e-4447-b6f0-08cb27a176d7","Type":"ContainerStarted","Data":"270a85fd0b3eb62bd68b78a3fc6a58642474d5c002eeb13e75eb4d755163754b"} Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.127747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" event={"ID":"6927c5b0-a30e-4447-b6f0-08cb27a176d7","Type":"ContainerStarted","Data":"fef335d5b0ce0d3c8174665b1278c77bb55fcf008dae2ed2ad3c9b2c040e5092"} Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.619888 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.619964 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.619967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:20 crc kubenswrapper[5033]: E0319 18:58:20.620899 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kcmn4" podUID="5120920c-fe7c-454a-9dd5-9c0b79e0fb04" Mar 19 18:58:20 crc kubenswrapper[5033]: I0319 18:58:20.620910 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:20 crc kubenswrapper[5033]: E0319 18:58:20.620980 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:20 crc kubenswrapper[5033]: E0319 18:58:20.621042 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:20 crc kubenswrapper[5033]: E0319 18:58:20.621100 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.621182 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.699144 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.699292 5033 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.739978 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wxw6w" podStartSLOduration=47.739957515 podStartE2EDuration="47.739957515s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:20.149070958 +0000 UTC m=+110.254100847" watchObservedRunningTime="2026-03-19 18:58:21.739957515 +0000 UTC m=+111.844987384" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.742518 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.742939 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.743207 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.743326 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.744679 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.745205 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.746022 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.746049 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.746150 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.746342 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.750664 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.751155 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.751374 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.751670 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.751784 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.751901 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752168 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752306 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752555 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752710 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752840 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.752949 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.754951 5033 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.755139 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.755312 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.755445 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.755575 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.755779 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jwxnr"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.756183 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.756504 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.756792 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.761617 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-545dt"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.762101 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kd52d"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.762480 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.763084 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.764271 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.765137 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.765650 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.765801 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.765969 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.766125 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.766305 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.766424 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.766565 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.766686 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.767950 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.768050 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2rhhh"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.784086 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.787192 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.788537 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.792907 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.794282 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.794799 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795014 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.794800 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795192 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795273 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795306 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 18:58:21 crc 
kubenswrapper[5033]: I0319 18:58:21.795372 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795517 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795558 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795661 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.795937 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x9vcw"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.796051 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.796321 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.796603 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.796660 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.796933 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cz22k"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.797124 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.797247 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.797874 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.797888 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.798211 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.798361 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.798747 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dq52v"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.799156 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.799614 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.800056 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.803063 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.803771 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-78xnn"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.804359 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.804433 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.807575 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.807685 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.807726 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.807812 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.807871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.814553 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.820341 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.820893 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.822343 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.822556 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.822715 5033 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.822742 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.822988 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823309 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823330 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823350 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-image-import-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823369 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 
19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823385 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-policies\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823425 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823439 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-dir\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823476 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhqs\" (UniqueName: \"kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: 
\"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-serving-cert\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823523 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfs5c\" (UniqueName: \"kubernetes.io/projected/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-kube-api-access-sfs5c\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823539 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823554 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxn6\" (UniqueName: \"kubernetes.io/projected/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-kube-api-access-bfxn6\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-encryption-config\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-service-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823605 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823673 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: 
\"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823748 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rjj\" (UniqueName: \"kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823788 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823824 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-encryption-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzz6\" (UniqueName: \"kubernetes.io/projected/e1f5e565-ab78-4e77-9cd5-17fab05529cb-kube-api-access-fwzz6\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85xw\" (UniqueName: \"kubernetes.io/projected/701c9459-2b31-4294-b118-9caf879adb17-kube-api-access-j85xw\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823891 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-serving-cert\") pod \"authentication-operator-69f744f599-545dt\" (UID: 
\"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823907 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823942 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-config\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823957 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.823972 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824007 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-images\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca47e31e-6c9f-471d-86ca-58ea515fc112-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824047 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824489 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-node-pullsecrets\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-serving-cert\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknn7\" (UniqueName: \"kubernetes.io/projected/ca47e31e-6c9f-471d-86ca-58ea515fc112-kube-api-access-rknn7\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824726 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-client\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824747 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824792 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824824 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824846 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.824873 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmsg\" (UniqueName: \"kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825465 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/701c9459-2b31-4294-b118-9caf879adb17-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825497 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825528 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-client\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825551 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-config\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit-dir\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825772 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825825 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825863 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.825887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826151 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826254 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826501 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826736 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826776 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.826832 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827149 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827370 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827613 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827845 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827862 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827915 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827849 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.827987 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.828346 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 18:58:21 crc 
kubenswrapper[5033]: I0319 18:58:21.829096 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.829302 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.829845 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.830777 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.830871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.830787 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.831120 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.839030 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.841812 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.842144 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.842543 5033 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.842857 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.854197 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.854368 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.854510 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.854555 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.855550 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.855720 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.855824 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.856249 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzpft"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.857062 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.860365 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.860591 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.860787 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.861325 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.861596 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.867400 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.869035 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.869280 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.869381 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.869674 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.869972 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.870826 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.871160 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.871384 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.871433 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.872012 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.873104 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.876550 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.877039 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8k27t"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.877711 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.877923 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.877948 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.878552 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.881981 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.882555 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.885177 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565778-gls8c"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.885716 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.886069 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.889392 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.889863 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.892807 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.893362 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.893679 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.894189 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.896890 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.897370 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.899569 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hb9f7"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.900061 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.901727 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69v2"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.902257 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.907684 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.908530 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.909101 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.910014 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.910636 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.910833 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.911685 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.912591 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.914974 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.915197 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.917157 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.919875 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.922935 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x9vcw"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.923995 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-545dt"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926204 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926500 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926540 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926582 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-policies\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926614 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-image-import-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926619 5033 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926633 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926688 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926705 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-dir\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926732 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhqs\" (UniqueName: \"kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs\") pod 
\"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926750 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfs5c\" (UniqueName: \"kubernetes.io/projected/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-kube-api-access-sfs5c\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926767 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926781 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926797 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-serving-cert\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926819 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-machine-approver-tls\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926841 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxn6\" (UniqueName: \"kubernetes.io/projected/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-kube-api-access-bfxn6\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926890 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-encryption-config\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926904 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-service-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916e9efa-04c5-4ce9-9d8c-47795bee49d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926941 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rjj\" (UniqueName: \"kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926962 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-auth-proxy-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-h772c\" 
(UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.926997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-encryption-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927014 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927052 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc87808-1374-443f-985a-019e644a3153-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927068 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927085 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzz6\" (UniqueName: \"kubernetes.io/projected/e1f5e565-ab78-4e77-9cd5-17fab05529cb-kube-api-access-fwzz6\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927107 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85xw\" (UniqueName: \"kubernetes.io/projected/701c9459-2b31-4294-b118-9caf879adb17-kube-api-access-j85xw\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-serving-cert\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927140 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc87808-1374-443f-985a-019e644a3153-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: 
\"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzd8\" (UniqueName: \"kubernetes.io/projected/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-kube-api-access-tnzd8\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927197 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927215 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-config\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927231 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927246 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927537 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927574 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916e9efa-04c5-4ce9-9d8c-47795bee49d0-config\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: 
\"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927593 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-images\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927612 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca47e31e-6c9f-471d-86ca-58ea515fc112-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-node-pullsecrets\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927656 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-serving-cert\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927674 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknn7\" (UniqueName: 
\"kubernetes.io/projected/ca47e31e-6c9f-471d-86ca-58ea515fc112-kube-api-access-rknn7\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927694 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-client\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/916e9efa-04c5-4ce9-9d8c-47795bee49d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927743 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc 
kubenswrapper[5033]: I0319 18:58:21.927761 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927779 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmsg\" (UniqueName: \"kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927819 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/701c9459-2b31-4294-b118-9caf879adb17-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927843 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca\") pod 
\"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-client\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927898 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927930 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75ws4\" (UniqueName: \"kubernetes.io/projected/acc87808-1374-443f-985a-019e644a3153-kube-api-access-75ws4\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927952 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit-dir\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927971 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-config\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927977 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-policies\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927996 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927466 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.928336 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-image-import-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.927381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.928424 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.928553 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.928752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-audit-dir\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.929028 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.929271 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.929666 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-config\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.929705 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-audit-dir\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.930583 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.931163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.931237 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-serving-cert\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.931366 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.931842 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.932337 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 
18:58:21.932585 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca47e31e-6c9f-471d-86ca-58ea515fc112-images\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.932677 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/701c9459-2b31-4294-b118-9caf879adb17-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.932826 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-serving-cert\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.932881 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e1f5e565-ab78-4e77-9cd5-17fab05529cb-node-pullsecrets\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.933209 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-config\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc 
kubenswrapper[5033]: I0319 18:58:21.934012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.934085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-serving-cert\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.934575 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jwxnr"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.934598 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.934609 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9gmzg"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.934944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.935365 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.936314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-encryption-config\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.936423 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.936441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.936897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.937032 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.937340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.937506 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-service-ca-bundle\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.937668 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.938070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 
18:58:21.938243 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.938743 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.939331 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.939868 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.940062 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: 
\"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.940102 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-etcd-client\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.940404 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca47e31e-6c9f-471d-86ca-58ea515fc112-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.941139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e1f5e565-ab78-4e77-9cd5-17fab05529cb-etcd-client\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.941202 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cz22k"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.942721 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-78xnn"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.944166 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.947287 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-8k27t"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.947314 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.947383 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.947812 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.950300 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.950325 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kd52d"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.950335 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-d94q2"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.950834 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.952002 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-encryption-config\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.952953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.967302 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.967349 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.970359 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.973643 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dq52v"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.976385 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.976429 5033 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.976441 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.982128 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-gls8c"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.982164 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.982174 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.983847 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.984672 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.985913 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.990053 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.990793 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzpft"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.993071 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fvwzx"] Mar 
19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.993893 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.994102 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.996469 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.997702 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69v2"] Mar 19 18:58:21 crc kubenswrapper[5033]: I0319 18:58:21.999007 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.001714 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2rhhh"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.004129 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gmzg"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.005401 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.005836 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.008320 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.010319 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-fvwzx"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.013470 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-ntlnh"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.014616 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.014691 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g9hmn"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.022170 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g9hmn"] Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.022413 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.026011 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75ws4\" (UniqueName: \"kubernetes.io/projected/acc87808-1374-443f-985a-019e644a3153-kube-api-access-75ws4\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028667 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-machine-approver-tls\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916e9efa-04c5-4ce9-9d8c-47795bee49d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-auth-proxy-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028793 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc87808-1374-443f-985a-019e644a3153-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028819 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/acc87808-1374-443f-985a-019e644a3153-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzd8\" (UniqueName: \"kubernetes.io/projected/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-kube-api-access-tnzd8\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916e9efa-04c5-4ce9-9d8c-47795bee49d0-config\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.028887 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/916e9efa-04c5-4ce9-9d8c-47795bee49d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.029493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.029699 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-auth-proxy-config\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.031676 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-machine-approver-tls\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.045078 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.065191 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.085864 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.105006 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.125387 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.134360 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.135544 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8"} Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.136094 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.145485 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.185208 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.189754 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acc87808-1374-443f-985a-019e644a3153-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.205679 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.224821 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.231991 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc87808-1374-443f-985a-019e644a3153-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.245758 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.265120 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.285609 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.306285 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.325773 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.345731 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.365363 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.384786 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.404887 5033 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.425542 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.445266 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.465269 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.484639 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.506608 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.525598 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.546143 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.564838 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.570282 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916e9efa-04c5-4ce9-9d8c-47795bee49d0-config\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: 
\"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.587127 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.604867 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.619663 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.619889 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.619911 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.619912 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.620953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916e9efa-04c5-4ce9-9d8c-47795bee49d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.624906 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.646429 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.665133 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.685699 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.704894 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.725658 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.745507 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.771188 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.785425 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.805513 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.825885 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.845163 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.865229 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.885659 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.903733 5033 request.go:700] Waited for 1.013671214s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.911958 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.926011 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.945740 5033 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.966332 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 18:58:22 crc kubenswrapper[5033]: I0319 18:58:22.987085 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.006503 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.044638 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.046321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.065045 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.085624 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.109980 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.124874 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 
18:58:23.146199 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.165854 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.206570 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.226080 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.245054 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.266404 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.285313 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.305899 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.325226 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.345351 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.365570 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.385606 5033 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.405552 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.426432 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.446198 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.465426 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.485381 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.506190 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.526080 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.545437 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.566189 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.608796 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhqs\" (UniqueName: 
\"kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs\") pod \"controller-manager-879f6c89f-fz4jq\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.622966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfs5c\" (UniqueName: \"kubernetes.io/projected/89ae05ba-3c04-4d2a-afe0-e379803a7b2f-kube-api-access-sfs5c\") pod \"authentication-operator-69f744f599-545dt\" (UID: \"89ae05ba-3c04-4d2a-afe0-e379803a7b2f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.638113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxn6\" (UniqueName: \"kubernetes.io/projected/82f15cf6-4ccb-45ef-ab62-87a903e0e29d-kube-api-access-bfxn6\") pod \"apiserver-7bbb656c7d-7bgbp\" (UID: \"82f15cf6-4ccb-45ef-ab62-87a903e0e29d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.655852 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.661415 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmsg\" (UniqueName: \"kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg\") pod \"route-controller-manager-6576b87f9c-h772c\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.668110 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.678859 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85xw\" (UniqueName: \"kubernetes.io/projected/701c9459-2b31-4294-b118-9caf879adb17-kube-api-access-j85xw\") pod \"cluster-samples-operator-665b6dd947-rwdr5\" (UID: \"701c9459-2b31-4294-b118-9caf879adb17\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.697635 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.701474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknn7\" (UniqueName: \"kubernetes.io/projected/ca47e31e-6c9f-471d-86ca-58ea515fc112-kube-api-access-rknn7\") pod \"machine-api-operator-5694c8668f-jwxnr\" (UID: \"ca47e31e-6c9f-471d-86ca-58ea515fc112\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.719968 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rjj\" (UniqueName: \"kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj\") pod \"oauth-openshift-558db77b4-kd52d\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.728387 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.747730 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 
18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.765529 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.787702 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.825015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzz6\" (UniqueName: \"kubernetes.io/projected/e1f5e565-ab78-4e77-9cd5-17fab05529cb-kube-api-access-fwzz6\") pod \"apiserver-76f77b778f-2rhhh\" (UID: \"e1f5e565-ab78-4e77-9cd5-17fab05529cb\") " pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.825922 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.845323 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.865974 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.885580 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.888473 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-545dt"] Mar 19 18:58:23 crc kubenswrapper[5033]: W0319 18:58:23.896284 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ae05ba_3c04_4d2a_afe0_e379803a7b2f.slice/crio-9f14b5a28a3adba947ac0a8eb3a0426c8a47132260aba27f42f077760509520d WatchSource:0}: Error finding container 9f14b5a28a3adba947ac0a8eb3a0426c8a47132260aba27f42f077760509520d: Status 404 returned error can't find the container with id 9f14b5a28a3adba947ac0a8eb3a0426c8a47132260aba27f42f077760509520d Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.904515 5033 request.go:700] Waited for 1.910403857s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.905584 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.908415 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.926075 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.944840 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.967323 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.976493 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.985855 5033 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.987630 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" Mar 19 18:58:23 crc kubenswrapper[5033]: I0319 18:58:23.992081 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.005529 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.007340 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.048966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75ws4\" (UniqueName: \"kubernetes.io/projected/acc87808-1374-443f-985a-019e644a3153-kube-api-access-75ws4\") pod \"kube-storage-version-migrator-operator-b67b599dd-hcqrs\" (UID: \"acc87808-1374-443f-985a-019e644a3153\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.063541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.067611 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.068004 
5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.069704 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/916e9efa-04c5-4ce9-9d8c-47795bee49d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jhcwp\" (UID: \"916e9efa-04c5-4ce9-9d8c-47795bee49d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.087706 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzd8\" (UniqueName: \"kubernetes.io/projected/fe2b65ba-5c7e-499d-8d55-e09ec227e12c-kube-api-access-tnzd8\") pod \"machine-approver-56656f9798-mvtt7\" (UID: \"fe2b65ba-5c7e-499d-8d55-e09ec227e12c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.105079 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.126051 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.156050 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.161667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" event={"ID":"82f15cf6-4ccb-45ef-ab62-87a903e0e29d","Type":"ContainerStarted","Data":"92663d9652f4baee1ca9b3b78efdf11840ed303e56e4fdadbf7c0a1abfcd8e82"} Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.161696 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.163415 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" event={"ID":"89ae05ba-3c04-4d2a-afe0-e379803a7b2f","Type":"ContainerStarted","Data":"26c970467deef5b725f7cf6ae3e3582ef574fcf6b8d09b7c4d274158bcf1ea0d"} Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.163443 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" event={"ID":"89ae05ba-3c04-4d2a-afe0-e379803a7b2f","Type":"ContainerStarted","Data":"9f14b5a28a3adba947ac0a8eb3a0426c8a47132260aba27f42f077760509520d"} Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.165690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" event={"ID":"dabe1e76-f401-4d8c-99a4-36f7acc7e241","Type":"ContainerStarted","Data":"790a3787fd177d9d40201cc965c57b45c8711dad22095e2add46de6c19a6dfab"} Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.165984 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.166929 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" event={"ID":"f123389a-beda-4156-bd02-56ccf2a479f1","Type":"ContainerStarted","Data":"343181e6aa68c0a3865417bd7540a90c24b2c1293f767b5309e4c88f1c820598"} Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.174871 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.185922 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.196328 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.205858 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.225107 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.296213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kd52d"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.324152 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2rhhh"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.475939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp"] Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.504820 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jwxnr"] Mar 19 18:58:24 crc kubenswrapper[5033]: W0319 18:58:24.513555 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod916e9efa_04c5_4ce9_9d8c_47795bee49d0.slice/crio-26fde067e0818588da0b27b4226acc095562a9e177235dd94f60eba0bf16be05 WatchSource:0}: Error finding container 
26fde067e0818588da0b27b4226acc095562a9e177235dd94f60eba0bf16be05: Status 404 returned error can't find the container with id 26fde067e0818588da0b27b4226acc095562a9e177235dd94f60eba0bf16be05 Mar 19 18:58:24 crc kubenswrapper[5033]: W0319 18:58:24.515325 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca47e31e_6c9f_471d_86ca_58ea515fc112.slice/crio-fd20c80aa6f0dbaf17e5f5e09da906e022eee127ae1d8b213bbfd275d86acff7 WatchSource:0}: Error finding container fd20c80aa6f0dbaf17e5f5e09da906e022eee127ae1d8b213bbfd275d86acff7: Status 404 returned error can't find the container with id fd20c80aa6f0dbaf17e5f5e09da906e022eee127ae1d8b213bbfd275d86acff7 Mar 19 18:58:24 crc kubenswrapper[5033]: I0319 18:58:24.629172 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs"] Mar 19 18:58:24 crc kubenswrapper[5033]: W0319 18:58:24.630899 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc87808_1374_443f_985a_019e644a3153.slice/crio-28ed5f949939db328827b9c80522f58d596e7de377a7ac982df75b81b8fa9d73 WatchSource:0}: Error finding container 28ed5f949939db328827b9c80522f58d596e7de377a7ac982df75b81b8fa9d73: Status 404 returned error can't find the container with id 28ed5f949939db328827b9c80522f58d596e7de377a7ac982df75b81b8fa9d73 Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.171883 5033 generic.go:334] "Generic (PLEG): container finished" podID="82f15cf6-4ccb-45ef-ab62-87a903e0e29d" containerID="8ffcb6201a3ca8320cba5bd25898ddef8fe652ec96382d582292ce8478254e28" exitCode=0 Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.171954 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" 
event={"ID":"82f15cf6-4ccb-45ef-ab62-87a903e0e29d","Type":"ContainerDied","Data":"8ffcb6201a3ca8320cba5bd25898ddef8fe652ec96382d582292ce8478254e28"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.173932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" event={"ID":"ca47e31e-6c9f-471d-86ca-58ea515fc112","Type":"ContainerStarted","Data":"5654eee76e16399d42411f7bd1d1389418b25c89bf482b7cfd792e3937e415a2"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.173975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" event={"ID":"ca47e31e-6c9f-471d-86ca-58ea515fc112","Type":"ContainerStarted","Data":"7a227c9fb176fc9cb3557f4d7d5f106ffb14b318d694ab4c53ee21cb3fae1739"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.173992 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" event={"ID":"ca47e31e-6c9f-471d-86ca-58ea515fc112","Type":"ContainerStarted","Data":"fd20c80aa6f0dbaf17e5f5e09da906e022eee127ae1d8b213bbfd275d86acff7"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.175170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" event={"ID":"916e9efa-04c5-4ce9-9d8c-47795bee49d0","Type":"ContainerStarted","Data":"a4c91dc4680d5836683a738c9f145e856231e484faf999810a3ec1a63856d900"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.175236 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" event={"ID":"916e9efa-04c5-4ce9-9d8c-47795bee49d0","Type":"ContainerStarted","Data":"26fde067e0818588da0b27b4226acc095562a9e177235dd94f60eba0bf16be05"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.177342 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" event={"ID":"701c9459-2b31-4294-b118-9caf879adb17","Type":"ContainerStarted","Data":"6abe60407691253f9e1d9071b4b59837db12375d83d979465491045b64e7726d"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.177372 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" event={"ID":"701c9459-2b31-4294-b118-9caf879adb17","Type":"ContainerStarted","Data":"332735c9aa89a16e9a5a6d72bab1bf5d040a8fa0819f343824026afb00ed83a0"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.177384 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" event={"ID":"701c9459-2b31-4294-b118-9caf879adb17","Type":"ContainerStarted","Data":"2968c40f3a928b75c18504d99b9d67cb6289e96912c969fc02378365c5925503"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.178779 5033 generic.go:334] "Generic (PLEG): container finished" podID="e1f5e565-ab78-4e77-9cd5-17fab05529cb" containerID="16599c452a79cf75e3776e83ccfb2ee2940b2f99bf5c41a52a3d8c581ba741d1" exitCode=0 Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.178838 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" event={"ID":"e1f5e565-ab78-4e77-9cd5-17fab05529cb","Type":"ContainerDied","Data":"16599c452a79cf75e3776e83ccfb2ee2940b2f99bf5c41a52a3d8c581ba741d1"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.178858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" event={"ID":"e1f5e565-ab78-4e77-9cd5-17fab05529cb","Type":"ContainerStarted","Data":"ea214bcd5fc042391f36f6bb570943fb22f402ae134922c172e5251f65c4dc74"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.180745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" event={"ID":"f123389a-beda-4156-bd02-56ccf2a479f1","Type":"ContainerStarted","Data":"0fe1b544c17f74855bb0c576294157fc23c2333aa9087cc054bce5f18662842e"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.180911 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.182168 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" event={"ID":"dabe1e76-f401-4d8c-99a4-36f7acc7e241","Type":"ContainerStarted","Data":"9b7f72059c52fd7f7137968fb0f6dc6285a93d73ec410732ac3fe104eb540603"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.182322 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.183717 5033 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fz4jq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.183776 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.184899 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" 
event={"ID":"acc87808-1374-443f-985a-019e644a3153","Type":"ContainerStarted","Data":"6a535ce45d095e265b7ddd111bf187df9dfbd658f038635a419ccbe77446e52d"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.184937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" event={"ID":"acc87808-1374-443f-985a-019e644a3153","Type":"ContainerStarted","Data":"28ed5f949939db328827b9c80522f58d596e7de377a7ac982df75b81b8fa9d73"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.187923 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" event={"ID":"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c","Type":"ContainerStarted","Data":"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.187956 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" event={"ID":"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c","Type":"ContainerStarted","Data":"1d519a9aee51ba50ecf86e6c5de6f3628d46fdb1acc9205b26b115d3ddb34f43"} Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.188336 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.190089 5033 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kd52d container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.190143 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.354837 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:25 crc kubenswrapper[5033]: I0319 18:58:25.581284 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=30.581269955 podStartE2EDuration="30.581269955s" podCreationTimestamp="2026-03-19 18:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:25.578640986 +0000 UTC m=+115.683670835" watchObservedRunningTime="2026-03-19 18:58:25.581269955 +0000 UTC m=+115.686299804" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.013424 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.013401621 podStartE2EDuration="2.013401621s" podCreationTimestamp="2026-03-19 18:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.013151752 +0000 UTC m=+116.118181601" watchObservedRunningTime="2026-03-19 18:58:26.013401621 +0000 UTC m=+116.118431470" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.061696 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" podStartSLOduration=51.06167741 podStartE2EDuration="51.06167741s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 18:58:26.058307356 +0000 UTC m=+116.163337205" watchObservedRunningTime="2026-03-19 18:58:26.06167741 +0000 UTC m=+116.166707269" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.098515 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hcqrs" podStartSLOduration=51.09849999 podStartE2EDuration="51.09849999s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.096917337 +0000 UTC m=+116.201947186" watchObservedRunningTime="2026-03-19 18:58:26.09849999 +0000 UTC m=+116.203529839" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.175568 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jhcwp" podStartSLOduration=51.175553207 podStartE2EDuration="51.175553207s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.174707189 +0000 UTC m=+116.279737048" watchObservedRunningTime="2026-03-19 18:58:26.175553207 +0000 UTC m=+116.280583056" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.261026 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jwxnr" podStartSLOduration=51.261006039 podStartE2EDuration="51.261006039s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.258607188 +0000 UTC m=+116.363637027" watchObservedRunningTime="2026-03-19 18:58:26.261006039 +0000 UTC m=+116.366035888" Mar 19 
18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.302793 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rwdr5" podStartSLOduration=52.302775628 podStartE2EDuration="52.302775628s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.302118275 +0000 UTC m=+116.407148124" watchObservedRunningTime="2026-03-19 18:58:26.302775628 +0000 UTC m=+116.407805467" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.346619 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" podStartSLOduration=52.346601596 podStartE2EDuration="52.346601596s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.345434286 +0000 UTC m=+116.450464135" watchObservedRunningTime="2026-03-19 18:58:26.346601596 +0000 UTC m=+116.451631445" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.375644 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-545dt" podStartSLOduration=52.375627702 podStartE2EDuration="52.375627702s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.372283068 +0000 UTC m=+116.477312917" watchObservedRunningTime="2026-03-19 18:58:26.375627702 +0000 UTC m=+116.480657551" Mar 19 18:58:26 crc kubenswrapper[5033]: I0319 18:58:26.423191 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" podStartSLOduration=52.423166856 podStartE2EDuration="52.423166856s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:26.420244657 +0000 UTC m=+116.525274506" watchObservedRunningTime="2026-03-19 18:58:26.423166856 +0000 UTC m=+116.528196705" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.498379 5033 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.586179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.595837 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:59:01.595794518 +0000 UTC m=+151.700824407 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.595948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2hjb\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596058 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596258 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596326 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596437 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596554 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596632 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596710 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.596796 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.597642 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.09762478 +0000 UTC m=+120.202654619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.598630 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.598669 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.610310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5120920c-fe7c-454a-9dd5-9c0b79e0fb04-metrics-certs\") pod \"network-metrics-daemon-kcmn4\" (UID: \"5120920c-fe7c-454a-9dd5-9c0b79e0fb04\") " pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.610705 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.616369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.623234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:29 crc kubenswrapper[5033]: W0319 18:58:29.663827 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe2b65ba_5c7e_499d_8d55_e09ec227e12c.slice/crio-2062aa4966144f9f7d59fae5339ebc5ef45bd34c83b1627646cd560194d1c413 WatchSource:0}: Error finding container 2062aa4966144f9f7d59fae5339ebc5ef45bd34c83b1627646cd560194d1c413: Status 404 returned error can't find the container with id 2062aa4966144f9f7d59fae5339ebc5ef45bd34c83b1627646cd560194d1c413 Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.698102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.698439 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nlv\" (UniqueName: \"kubernetes.io/projected/369378a6-ffae-46d6-bcce-575e033d51b9-kube-api-access-q6nlv\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.698574 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.698644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/449dd8eb-1af2-4fec-b84f-42b567d38342-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.699120 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.199090176 +0000 UTC m=+120.304120045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.700421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49vz\" (UniqueName: \"kubernetes.io/projected/ac25d35e-036b-4d7f-a9e8-2417681f2a1c-kube-api-access-l49vz\") pod \"downloads-7954f5f757-x9vcw\" (UID: \"ac25d35e-036b-4d7f-a9e8-2417681f2a1c\") " pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.700591 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.700636 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.700741 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-srv-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.700926 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701035 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7tjc\" (UniqueName: \"kubernetes.io/projected/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-kube-api-access-w7tjc\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701137 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-service-ca\") 
pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srddd\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-kube-api-access-srddd\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701241 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hjb\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701276 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2fdbed76-a278-424a-8c8a-74d11ac4195e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701308 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwvt\" (UniqueName: \"kubernetes.io/projected/1df2d9d5-b12f-4312-9e66-581123788ac5-kube-api-access-plwvt\") pod \"migrator-59844c95c7-msk9m\" (UID: \"1df2d9d5-b12f-4312-9e66-581123788ac5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701307 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701344 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701471 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369378a6-ffae-46d6-bcce-575e033d51b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-profile-collector-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: 
\"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040cee13-6799-4fe8-b64a-ad70d7b1185d-config\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701678 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040cee13-6799-4fe8-b64a-ad70d7b1185d-serving-cert\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701737 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-images\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701769 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-srv-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701804 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-config\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-trusted-ca\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701889 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-client\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.701919 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxvfq\" (UniqueName: \"kubernetes.io/projected/3aea5341-06fc-4b54-90a5-860a408e6759-kube-api-access-pxvfq\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 
18:58:29.701970 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69d34ba5-eb17-4b36-b155-12a51c887d79-metrics-tls\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702040 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbb9\" (UniqueName: \"kubernetes.io/projected/1e837d74-e49f-4138-aba8-8170e284aeb3-kube-api-access-nqbb9\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702107 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2fe23-8f48-484f-a955-85ea15c3c1fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702159 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e74482-1056-4bfe-995e-10ec0dd18796-trusted-ca\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.702851 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caab64e7-53a2-46de-833b-2c55da422e4b-serving-cert\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703273 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 
18:58:29.703358 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e837d74-e49f-4138-aba8-8170e284aeb3-proxy-tls\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703413 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703436 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbed76-a278-424a-8c8a-74d11ac4195e-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703685 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.703951 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.704354 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct9g\" (UniqueName: \"kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.704690 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/449dd8eb-1af2-4fec-b84f-42b567d38342-config\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.704938 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nwpl\" (UniqueName: \"kubernetes.io/projected/01738363-9e09-480f-b65d-a76f264289eb-kube-api-access-9nwpl\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.704975 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9wqq\" (UniqueName: \"kubernetes.io/projected/040cee13-6799-4fe8-b64a-ad70d7b1185d-kube-api-access-b9wqq\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.705017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-config\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.705242 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-serving-cert\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.705356 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: 
\"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.705391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.705647 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nrxm\" (UniqueName: \"kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.706083 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfz95\" (UniqueName: \"kubernetes.io/projected/e7c16707-41f6-4725-8944-f9e31b5ddbe6-kube-api-access-dfz95\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.706290 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78e74482-1056-4bfe-995e-10ec0dd18796-metrics-tls\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.706748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.708106 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710345 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710410 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gl6\" (UniqueName: \"kubernetes.io/projected/69d34ba5-eb17-4b36-b155-12a51c887d79-kube-api-access-n4gl6\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710441 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/449dd8eb-1af2-4fec-b84f-42b567d38342-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxr5x\" (UniqueName: \"kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x\") pod \"auto-csr-approver-29565778-gls8c\" (UID: \"05cd9325-9740-4a70-98a7-3de9ebb30035\") " pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710579 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzqc\" (UniqueName: \"kubernetes.io/projected/2fdbed76-a278-424a-8c8a-74d11ac4195e-kube-api-access-rhzqc\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710630 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710659 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-ca\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710691 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-ng6tq\" (UniqueName: \"kubernetes.io/projected/84e2fe23-8f48-484f-a955-85ea15c3c1fa-kube-api-access-ng6tq\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710757 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqf8d\" (UniqueName: \"kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710876 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710908 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369378a6-ffae-46d6-bcce-575e033d51b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: 
\"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710938 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.710999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd1faec-c933-445c-a6cd-71491f96a8dd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.711051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.711115 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2fe23-8f48-484f-a955-85ea15c3c1fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 
crc kubenswrapper[5033]: I0319 18:58:29.711148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.711194 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rsx4\" (UniqueName: \"kubernetes.io/projected/3fd1faec-c933-445c-a6cd-71491f96a8dd-kube-api-access-6rsx4\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.711225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5ml\" (UniqueName: \"kubernetes.io/projected/caab64e7-53a2-46de-833b-2c55da422e4b-kube-api-access-qq5ml\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.712952 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.212927726 +0000 UTC m=+120.317957615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.718762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.720239 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.723385 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hjb\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812481 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.812695 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.312657563 +0000 UTC m=+120.417687432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812826 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l49vz\" (UniqueName: \"kubernetes.io/projected/ac25d35e-036b-4d7f-a9e8-2417681f2a1c-kube-api-access-l49vz\") pod \"downloads-7954f5f757-x9vcw\" (UID: \"ac25d35e-036b-4d7f-a9e8-2417681f2a1c\") " pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812895 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-srv-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjlj\" (UniqueName: \"kubernetes.io/projected/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-kube-api-access-6vjlj\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812945 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7tjc\" (UniqueName: \"kubernetes.io/projected/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-kube-api-access-w7tjc\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.812989 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-webhook-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813010 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-plugins-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-stats-auth\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813054 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-service-ca\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813076 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srddd\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-kube-api-access-srddd\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-socket-dir\") pod 
\"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813131 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4c325-85dd-463a-ae31-c726008e9b08-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2fdbed76-a278-424a-8c8a-74d11ac4195e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813201 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plwvt\" (UniqueName: \"kubernetes.io/projected/1df2d9d5-b12f-4312-9e66-581123788ac5-kube-api-access-plwvt\") pod \"migrator-59844c95c7-msk9m\" (UID: \"1df2d9d5-b12f-4312-9e66-581123788ac5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813222 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813421 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369378a6-ffae-46d6-bcce-575e033d51b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813465 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-profile-collector-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813488 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040cee13-6799-4fe8-b64a-ad70d7b1185d-config\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813513 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn6qc\" (UniqueName: \"kubernetes.io/projected/84dcb066-a28e-4283-ba72-b2958adec642-kube-api-access-tn6qc\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813565 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040cee13-6799-4fe8-b64a-ad70d7b1185d-serving-cert\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813588 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-images\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813608 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-srv-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813634 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbw8\" (UniqueName: \"kubernetes.io/projected/a9dc5a6e-183c-4362-a581-598293c3310c-kube-api-access-kdbw8\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813658 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-config\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813678 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready\") pod 
\"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-client\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813720 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxvfq\" (UniqueName: \"kubernetes.io/projected/3aea5341-06fc-4b54-90a5-860a408e6759-kube-api-access-pxvfq\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-trusted-ca\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813762 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69d34ba5-eb17-4b36-b155-12a51c887d79-metrics-tls\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-cert\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813808 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88x8r\" (UniqueName: \"kubernetes.io/projected/b71787f5-93bf-434a-b67d-63be189d843e-kube-api-access-88x8r\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813853 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbb9\" (UniqueName: \"kubernetes.io/projected/1e837d74-e49f-4138-aba8-8170e284aeb3-kube-api-access-nqbb9\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2fe23-8f48-484f-a955-85ea15c3c1fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 
18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813894 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813918 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57t6b\" (UniqueName: \"kubernetes.io/projected/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-kube-api-access-57t6b\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813953 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e74482-1056-4bfe-995e-10ec0dd18796-trusted-ca\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813973 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-csi-data-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.813996 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caab64e7-53a2-46de-833b-2c55da422e4b-serving-cert\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814059 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e837d74-e49f-4138-aba8-8170e284aeb3-proxy-tls\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814102 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814125 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.814203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.816122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-images\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.816282 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040cee13-6799-4fe8-b64a-ad70d7b1185d-config\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.816476 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-trusted-ca\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.818159 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbed76-a278-424a-8c8a-74d11ac4195e-serving-cert\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.818802 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2fdbed76-a278-424a-8c8a-74d11ac4195e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.819232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.819478 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-default-certificate\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " 
pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.819706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.820774 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-config\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.820797 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caab64e7-53a2-46de-833b-2c55da422e4b-serving-cert\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.820786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-service-ca\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.821841 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e74482-1056-4bfe-995e-10ec0dd18796-trusted-ca\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbed76-a278-424a-8c8a-74d11ac4195e-serving-cert\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822799 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0fcc05b-da6b-436a-b895-8ae31a630fad-config-volume\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822872 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct9g\" (UniqueName: \"kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822898 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84dcb066-a28e-4283-ba72-b2958adec642-tmpfs\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: 
\"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-service-ca-bundle\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822965 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/449dd8eb-1af2-4fec-b84f-42b567d38342-config\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.822990 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824134 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/449dd8eb-1af2-4fec-b84f-42b567d38342-config\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824166 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824244 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nwpl\" (UniqueName: \"kubernetes.io/projected/01738363-9e09-480f-b65d-a76f264289eb-kube-api-access-9nwpl\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824294 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b71787f5-93bf-434a-b67d-63be189d843e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824321 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9wqq\" (UniqueName: \"kubernetes.io/projected/040cee13-6799-4fe8-b64a-ad70d7b1185d-kube-api-access-b9wqq\") pod 
\"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824428 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-config\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5px77\" (UniqueName: \"kubernetes.io/projected/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-kube-api-access-5px77\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824560 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4bt\" (UniqueName: \"kubernetes.io/projected/e2933377-6670-4f1f-b5d7-ebe2a832c460-kube-api-access-hx4bt\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824590 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-serving-cert\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824650 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824697 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nrxm\" (UniqueName: \"kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.824723 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a4c325-85dd-463a-ae31-c726008e9b08-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.825398 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.825579 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caab64e7-53a2-46de-833b-2c55da422e4b-config\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.828044 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-serving-cert\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.828883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e837d74-e49f-4138-aba8-8170e284aeb3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.828907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.829698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config\") pod 
\"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.829702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf7nb\" (UniqueName: \"kubernetes.io/projected/15a23e16-4194-4773-a4e8-1c3515d31c5c-kube-api-access-jf7nb\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.829778 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a9dc5a6e-183c-4362-a581-598293c3310c-signing-key\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.830276 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfz95\" (UniqueName: \"kubernetes.io/projected/e7c16707-41f6-4725-8944-f9e31b5ddbe6-kube-api-access-dfz95\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.830323 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a9dc5a6e-183c-4362-a581-598293c3310c-signing-cabundle\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.830374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/78e74482-1056-4bfe-995e-10ec0dd18796-metrics-tls\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.830420 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.832403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834237 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gl6\" (UniqueName: \"kubernetes.io/projected/69d34ba5-eb17-4b36-b155-12a51c887d79-kube-api-access-n4gl6\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834322 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/449dd8eb-1af2-4fec-b84f-42b567d38342-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc 
kubenswrapper[5033]: I0319 18:58:29.834363 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0fcc05b-da6b-436a-b895-8ae31a630fad-metrics-tls\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834770 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-client\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834899 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-certs\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834953 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxr5x\" (UniqueName: \"kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x\") pod \"auto-csr-approver-29565778-gls8c\" (UID: \"05cd9325-9740-4a70-98a7-3de9ebb30035\") " pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.834982 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzqc\" (UniqueName: \"kubernetes.io/projected/2fdbed76-a278-424a-8c8a-74d11ac4195e-kube-api-access-rhzqc\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.835046 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-mountpoint-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.835095 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.835442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-ca\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.835491 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbq46\" (UniqueName: \"kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.842517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f8d44d7-9ede-4090-96da-f59dc7a10cf0-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-l7dwm\" (UID: \"6f8d44d7-9ede-4090-96da-f59dc7a10cf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843603 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7tjc\" (UniqueName: \"kubernetes.io/projected/91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4-kube-api-access-w7tjc\") pod \"multus-admission-controller-857f4d67dd-8k27t\" (UID: \"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843620 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78e74482-1056-4bfe-995e-10ec0dd18796-metrics-tls\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6tq\" (UniqueName: \"kubernetes.io/projected/84e2fe23-8f48-484f-a955-85ea15c3c1fa-kube-api-access-ng6tq\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843948 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-node-bootstrap-token\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.843981 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-metrics-certs\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.844022 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.844295 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqf8d\" (UniqueName: \"kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.845539 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 
18:58:29.845611 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.851299 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.851381 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-registration-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.851954 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxvfq\" (UniqueName: \"kubernetes.io/projected/3aea5341-06fc-4b54-90a5-860a408e6759-kube-api-access-pxvfq\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852680 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/449dd8eb-1af2-4fec-b84f-42b567d38342-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852717 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369378a6-ffae-46d6-bcce-575e033d51b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852752 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd1faec-c933-445c-a6cd-71491f96a8dd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.852775 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-proxy-tls\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853318 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369378a6-ffae-46d6-bcce-575e033d51b9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853357 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853423 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2fe23-8f48-484f-a955-85ea15c3c1fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853441 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853499 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rsx4\" (UniqueName: \"kubernetes.io/projected/3fd1faec-c933-445c-a6cd-71491f96a8dd-kube-api-access-6rsx4\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853517 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5ml\" (UniqueName: \"kubernetes.io/projected/caab64e7-53a2-46de-833b-2c55da422e4b-kube-api-access-qq5ml\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853537 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql7l\" (UniqueName: \"kubernetes.io/projected/b0fcc05b-da6b-436a-b895-8ae31a630fad-kube-api-access-5ql7l\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853578 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxp5s\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-kube-api-access-sxp5s\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6nlv\" (UniqueName: \"kubernetes.io/projected/369378a6-ffae-46d6-bcce-575e033d51b9-kube-api-access-q6nlv\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.853694 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-profile-collector-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.856249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369378a6-ffae-46d6-bcce-575e033d51b9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.856674 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfz95\" (UniqueName: \"kubernetes.io/projected/e7c16707-41f6-4725-8944-f9e31b5ddbe6-kube-api-access-dfz95\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.856752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e837d74-e49f-4138-aba8-8170e284aeb3-proxy-tls\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.856831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbb9\" (UniqueName: \"kubernetes.io/projected/1e837d74-e49f-4138-aba8-8170e284aeb3-kube-api-access-nqbb9\") pod \"machine-config-operator-74547568cd-d4vcj\" (UID: \"1e837d74-e49f-4138-aba8-8170e284aeb3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.857001 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.356985118 +0000 UTC m=+120.462014977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.857368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/449dd8eb-1af2-4fec-b84f-42b567d38342-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.857889 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/3aea5341-06fc-4b54-90a5-860a408e6759-etcd-ca\") pod \"etcd-operator-b45778765-dzpft\" (UID: \"3aea5341-06fc-4b54-90a5-860a408e6759\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.858257 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040cee13-6799-4fe8-b64a-ad70d7b1185d-serving-cert\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.858584 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srddd\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-kube-api-access-srddd\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.859876 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct9g\" (UniqueName: \"kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.860561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzqc\" (UniqueName: \"kubernetes.io/projected/2fdbed76-a278-424a-8c8a-74d11ac4195e-kube-api-access-rhzqc\") pod \"openshift-config-operator-7777fb866f-cz22k\" (UID: \"2fdbed76-a278-424a-8c8a-74d11ac4195e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.860649 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.860726 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.860835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2fe23-8f48-484f-a955-85ea15c3c1fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.861755 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nrxm\" (UniqueName: \"kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.862610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nwpl\" (UniqueName: \"kubernetes.io/projected/01738363-9e09-480f-b65d-a76f264289eb-kube-api-access-9nwpl\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.863420 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqvsx\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.864429 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6tq\" (UniqueName: \"kubernetes.io/projected/84e2fe23-8f48-484f-a955-85ea15c3c1fa-kube-api-access-ng6tq\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866224 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume\") pod \"collect-profiles-29565765-jfjkd\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866250 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gl6\" (UniqueName: \"kubernetes.io/projected/69d34ba5-eb17-4b36-b155-12a51c887d79-kube-api-access-n4gl6\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866280 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fd1faec-c933-445c-a6cd-71491f96a8dd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866639 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2fe23-8f48-484f-a955-85ea15c3c1fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47hmx\" (UID: \"84e2fe23-8f48-484f-a955-85ea15c3c1fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866666 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7c16707-41f6-4725-8944-f9e31b5ddbe6-srv-cert\") pod \"catalog-operator-68c6474976-2pw6j\" (UID: \"e7c16707-41f6-4725-8944-f9e31b5ddbe6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.866907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwvt\" (UniqueName: \"kubernetes.io/projected/1df2d9d5-b12f-4312-9e66-581123788ac5-kube-api-access-plwvt\") pod \"migrator-59844c95c7-msk9m\" (UID: \"1df2d9d5-b12f-4312-9e66-581123788ac5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.867058 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9wqq\" (UniqueName: \"kubernetes.io/projected/040cee13-6799-4fe8-b64a-ad70d7b1185d-kube-api-access-b9wqq\") pod \"service-ca-operator-777779d784-l5jjr\" (UID: \"040cee13-6799-4fe8-b64a-ad70d7b1185d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.868257 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/69d34ba5-eb17-4b36-b155-12a51c887d79-metrics-tls\") pod \"dns-operator-744455d44c-78xnn\" (UID: \"69d34ba5-eb17-4b36-b155-12a51c887d79\") " pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.868612 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.870519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqf8d\" (UniqueName: \"kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d\") pod \"console-f9d7485db-5jw55\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.870818 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxr5x\" (UniqueName: \"kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x\") pod \"auto-csr-approver-29565778-gls8c\" (UID: \"05cd9325-9740-4a70-98a7-3de9ebb30035\") " pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.872417 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.874815 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/449dd8eb-1af2-4fec-b84f-42b567d38342-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w5g2b\" (UID: \"449dd8eb-1af2-4fec-b84f-42b567d38342\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.875378 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5ml\" (UniqueName: \"kubernetes.io/projected/caab64e7-53a2-46de-833b-2c55da422e4b-kube-api-access-qq5ml\") pod \"console-operator-58897d9998-dq52v\" (UID: \"caab64e7-53a2-46de-833b-2c55da422e4b\") " pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.875888 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nlv\" (UniqueName: \"kubernetes.io/projected/369378a6-ffae-46d6-bcce-575e033d51b9-kube-api-access-q6nlv\") pod \"openshift-apiserver-operator-796bbdcf4f-k4qmh\" (UID: \"369378a6-ffae-46d6-bcce-575e033d51b9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.875943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01738363-9e09-480f-b65d-a76f264289eb-srv-cert\") pod \"olm-operator-6b444d44fb-q7tcz\" (UID: \"01738363-9e09-480f-b65d-a76f264289eb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.876230 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rsx4\" (UniqueName: 
\"kubernetes.io/projected/3fd1faec-c933-445c-a6cd-71491f96a8dd-kube-api-access-6rsx4\") pod \"package-server-manager-789f6589d5-ds9rw\" (UID: \"3fd1faec-c933-445c-a6cd-71491f96a8dd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.876266 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e74482-1056-4bfe-995e-10ec0dd18796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kfrq6\" (UID: \"78e74482-1056-4bfe-995e-10ec0dd18796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.876882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49vz\" (UniqueName: \"kubernetes.io/projected/ac25d35e-036b-4d7f-a9e8-2417681f2a1c-kube-api-access-l49vz\") pod \"downloads-7954f5f757-x9vcw\" (UID: \"ac25d35e-036b-4d7f-a9e8-2417681f2a1c\") " pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.882620 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.883734 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.889474 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.893131 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kcmn4" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.898727 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.905836 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.912740 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.918959 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.940866 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.948951 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.955172 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.958962 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959154 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn6qc\" (UniqueName: \"kubernetes.io/projected/84dcb066-a28e-4283-ba72-b2958adec642-kube-api-access-tn6qc\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdbw8\" (UniqueName: \"kubernetes.io/projected/a9dc5a6e-183c-4362-a581-598293c3310c-kube-api-access-kdbw8\") pod 
\"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-cert\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.959284 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.459267912 +0000 UTC m=+120.564297761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959308 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88x8r\" (UniqueName: \"kubernetes.io/projected/b71787f5-93bf-434a-b67d-63be189d843e-kube-api-access-88x8r\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959328 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57t6b\" (UniqueName: 
\"kubernetes.io/projected/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-kube-api-access-57t6b\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959347 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-csi-data-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959365 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959385 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959403 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-default-certificate\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959420 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0fcc05b-da6b-436a-b895-8ae31a630fad-config-volume\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84dcb066-a28e-4283-ba72-b2958adec642-tmpfs\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959471 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-service-ca-bundle\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959489 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b71787f5-93bf-434a-b67d-63be189d843e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.959580 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960079 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-csi-data-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960358 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0fcc05b-da6b-436a-b895-8ae31a630fad-config-volume\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960713 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/84dcb066-a28e-4283-ba72-b2958adec642-tmpfs\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960747 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5px77\" (UniqueName: \"kubernetes.io/projected/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-kube-api-access-5px77\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx4bt\" (UniqueName: \"kubernetes.io/projected/e2933377-6670-4f1f-b5d7-ebe2a832c460-kube-api-access-hx4bt\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a4c325-85dd-463a-ae31-c726008e9b08-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960818 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf7nb\" (UniqueName: \"kubernetes.io/projected/15a23e16-4194-4773-a4e8-1c3515d31c5c-kube-api-access-jf7nb\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/a9dc5a6e-183c-4362-a581-598293c3310c-signing-key\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a9dc5a6e-183c-4362-a581-598293c3310c-signing-cabundle\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960878 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0fcc05b-da6b-436a-b895-8ae31a630fad-metrics-tls\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960894 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-certs\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960911 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-mountpoint-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960929 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbq46\" (UniqueName: 
\"kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960949 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-node-bootstrap-token\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-metrics-certs\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.960988 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-registration-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-proxy-tls\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961030 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961050 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql7l\" (UniqueName: \"kubernetes.io/projected/b0fcc05b-da6b-436a-b895-8ae31a630fad-kube-api-access-5ql7l\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961067 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxp5s\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-kube-api-access-sxp5s\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961104 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjlj\" (UniqueName: \"kubernetes.io/projected/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-kube-api-access-6vjlj\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961122 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-webhook-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 
19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-plugins-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961156 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-stats-auth\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961176 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-socket-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4c325-85dd-463a-ae31-c726008e9b08-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.961426 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc 
kubenswrapper[5033]: I0319 18:58:29.962236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-service-ca-bundle\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.962403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.965182 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-registration-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.965253 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-mountpoint-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.965474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-plugins-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc 
kubenswrapper[5033]: I0319 18:58:29.966614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1a4c325-85dd-463a-ae31-c726008e9b08-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:29 crc kubenswrapper[5033]: E0319 18:58:29.966934 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.466918512 +0000 UTC m=+120.571948491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.969358 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-proxy-tls\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.971085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-webhook-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.971255 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15a23e16-4194-4773-a4e8-1c3515d31c5c-socket-dir\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.971355 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a9dc5a6e-183c-4362-a581-598293c3310c-signing-key\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.971800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-certs\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.972181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a9dc5a6e-183c-4362-a581-598293c3310c-signing-cabundle\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.972576 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.972978 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b71787f5-93bf-434a-b67d-63be189d843e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.972991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-default-certificate\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.973051 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84dcb066-a28e-4283-ba72-b2958adec642-apiservice-cert\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.973399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-cert\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.975297 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-metrics-certs\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.975412 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.976189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-stats-auth\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.979825 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e2933377-6670-4f1f-b5d7-ebe2a832c460-node-bootstrap-token\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.985428 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57t6b\" (UniqueName: \"kubernetes.io/projected/a54f7af1-8f53-4aa5-8282-3f19aa50e57e-kube-api-access-57t6b\") pod \"ingress-canary-9gmzg\" (UID: \"a54f7af1-8f53-4aa5-8282-3f19aa50e57e\") " pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.986137 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88x8r\" (UniqueName: 
\"kubernetes.io/projected/b71787f5-93bf-434a-b67d-63be189d843e-kube-api-access-88x8r\") pod \"control-plane-machine-set-operator-78cbb6b69f-75f9n\" (UID: \"b71787f5-93bf-434a-b67d-63be189d843e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.989809 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbw8\" (UniqueName: \"kubernetes.io/projected/a9dc5a6e-183c-4362-a581-598293c3310c-kube-api-access-kdbw8\") pod \"service-ca-9c57cc56f-f69v2\" (UID: \"a9dc5a6e-183c-4362-a581-598293c3310c\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.989977 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn6qc\" (UniqueName: \"kubernetes.io/projected/84dcb066-a28e-4283-ba72-b2958adec642-kube-api-access-tn6qc\") pod \"packageserver-d55dfcdfc-n8gqn\" (UID: \"84dcb066-a28e-4283-ba72-b2958adec642\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.991159 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" Mar 19 18:58:29 crc kubenswrapper[5033]: I0319 18:58:29.994885 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.000077 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a4c325-85dd-463a-ae31-c726008e9b08-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.007823 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.007957 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0fcc05b-da6b-436a-b895-8ae31a630fad-metrics-tls\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.020092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql7l\" (UniqueName: \"kubernetes.io/projected/b0fcc05b-da6b-436a-b895-8ae31a630fad-kube-api-access-5ql7l\") pod \"dns-default-fvwzx\" (UID: \"b0fcc05b-da6b-436a-b895-8ae31a630fad\") " pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.020101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbq46\" (UniqueName: \"kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46\") pod \"cni-sysctl-allowlist-ds-ntlnh\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.020291 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjlj\" (UniqueName: \"kubernetes.io/projected/4ee36317-5c1d-4ef5-a81f-c7d96e18e506-kube-api-access-6vjlj\") pod \"router-default-5444994796-hb9f7\" (UID: \"4ee36317-5c1d-4ef5-a81f-c7d96e18e506\") " pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.021221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf7nb\" (UniqueName: \"kubernetes.io/projected/15a23e16-4194-4773-a4e8-1c3515d31c5c-kube-api-access-jf7nb\") pod \"csi-hostpathplugin-g9hmn\" (UID: \"15a23e16-4194-4773-a4e8-1c3515d31c5c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.027512 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gmzg" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.029237 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4bt\" (UniqueName: \"kubernetes.io/projected/e2933377-6670-4f1f-b5d7-ebe2a832c460-kube-api-access-hx4bt\") pod \"machine-config-server-d94q2\" (UID: \"e2933377-6670-4f1f-b5d7-ebe2a832c460\") " pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.029321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxp5s\" (UniqueName: \"kubernetes.io/projected/e1a4c325-85dd-463a-ae31-c726008e9b08-kube-api-access-sxp5s\") pod \"cluster-image-registry-operator-dc59b4c8b-757h6\" (UID: \"e1a4c325-85dd-463a-ae31-c726008e9b08\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.029593 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.034034 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5px77\" (UniqueName: \"kubernetes.io/projected/f5c9eadf-046d-4332-9664-e9b48f0e3ad0-kube-api-access-5px77\") pod \"machine-config-controller-84d6567774-bppkn\" (UID: \"f5c9eadf-046d-4332-9664-e9b48f0e3ad0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.044947 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.052906 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.059571 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.062715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.063393 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.563371936 +0000 UTC m=+120.668401785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.084706 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.090743 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.097917 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.104654 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.111789 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.122898 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.132473 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.147400 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.154306 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.166748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.174280 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.674264032 +0000 UTC m=+120.779293871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.214359 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" event={"ID":"e1f5e565-ab78-4e77-9cd5-17fab05529cb","Type":"ContainerStarted","Data":"1989c668045675f02bac9676fa62a54853423880d09e592549cda35741378785"} Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.223805 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" 
event={"ID":"fe2b65ba-5c7e-499d-8d55-e09ec227e12c","Type":"ContainerStarted","Data":"2bfa63367947f0a4b9d8387637c11a193eefbeccfeb2567c43b82455547d1801"} Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.223864 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" event={"ID":"fe2b65ba-5c7e-499d-8d55-e09ec227e12c","Type":"ContainerStarted","Data":"2062aa4966144f9f7d59fae5339ebc5ef45bd34c83b1627646cd560194d1c413"} Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.238191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" event={"ID":"82f15cf6-4ccb-45ef-ab62-87a903e0e29d","Type":"ContainerStarted","Data":"697978381d74f7b67076a0572aa39b6669de41d4830c2b0bf653074b65781306"} Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.245973 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.257045 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.265132 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" podStartSLOduration=55.265115227 podStartE2EDuration="55.265115227s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:30.263832144 +0000 UTC m=+120.368862003" watchObservedRunningTime="2026-03-19 18:58:30.265115227 +0000 UTC m=+120.370145076" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.277984 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.281166 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.780736048 +0000 UTC m=+120.885765897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.290327 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.291384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.297203 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:30.797188367 +0000 UTC m=+120.902218216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.298953 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.308871 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.313574 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.332373 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d94q2" Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.392896 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.393149 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.893133295 +0000 UTC m=+120.998163144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: W0319 18:58:30.429864 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee36317_5c1d_4ef5_a81f_c7d96e18e506.slice/crio-ae1ab0072db7304c6624b59dcb722cec82b9ec0147505253eadb9d8c5dd7233a WatchSource:0}: Error finding container ae1ab0072db7304c6624b59dcb722cec82b9ec0147505253eadb9d8c5dd7233a: Status 404 returned error can't find the container with id ae1ab0072db7304c6624b59dcb722cec82b9ec0147505253eadb9d8c5dd7233a Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.494486 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.494876 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:30.99486398 +0000 UTC m=+121.099893829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.600687 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.601184 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.10116053 +0000 UTC m=+121.206190379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.708549 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.709342 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.209328263 +0000 UTC m=+121.314358112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.811346 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.811500 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.311472662 +0000 UTC m=+121.416502511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.811599 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.811898 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.311886206 +0000 UTC m=+121.416916045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.912276 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.912364 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.412346668 +0000 UTC m=+121.517376517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:30 crc kubenswrapper[5033]: I0319 18:58:30.913029 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:30 crc kubenswrapper[5033]: E0319 18:58:30.913512 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.413495637 +0000 UTC m=+121.518525486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.014775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.015630 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.515609684 +0000 UTC m=+121.620639543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.029303 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2933377_6670_4f1f_b5d7_ebe2a832c460.slice/crio-428cbfa64a9656c1e60be742dc4686ea4f4e48065300ff8a19a73dd8c9107416.scope\": RecentStats: unable to find data in memory cache]" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.117125 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.117414 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.617401861 +0000 UTC m=+121.722431710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.161494 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.162371 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.165246 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.165719 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.176021 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.217981 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.218030 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dzpft"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.218149 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.218182 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.718158853 +0000 UTC m=+121.823188752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.218213 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.218265 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.218543 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.718531036 +0000 UTC m=+121.823560885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: W0319 18:58:31.254294 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aea5341_06fc_4b54_90a5_860a408e6759.slice/crio-0f1569abfa8005a46d3a404c3d728aae58716f98bbc3b797a6351d5d65246041 WatchSource:0}: Error finding container 0f1569abfa8005a46d3a404c3d728aae58716f98bbc3b797a6351d5d65246041: Status 404 returned error can't find the container with id 0f1569abfa8005a46d3a404c3d728aae58716f98bbc3b797a6351d5d65246041 Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.257427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" event={"ID":"e1f5e565-ab78-4e77-9cd5-17fab05529cb","Type":"ContainerStarted","Data":"d7a7b41f89d297ae80d4ebba17b70b5bdb6ab41708bd1c8f56cff1d54e3f531d"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.279341 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" 
event={"ID":"fe2b65ba-5c7e-499d-8d55-e09ec227e12c","Type":"ContainerStarted","Data":"6e426d058f20e644533583b4551031692bc88cb15d368821c969715782f41264"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.295821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hb9f7" event={"ID":"4ee36317-5c1d-4ef5-a81f-c7d96e18e506","Type":"ContainerStarted","Data":"360eaa450456cc5012a978b1c48879ff669a2b2eb894b53349c0b647353348f9"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.295910 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hb9f7" event={"ID":"4ee36317-5c1d-4ef5-a81f-c7d96e18e506","Type":"ContainerStarted","Data":"ae1ab0072db7304c6624b59dcb722cec82b9ec0147505253eadb9d8c5dd7233a"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.301129 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" podStartSLOduration=57.3011047 podStartE2EDuration="57.3011047s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:31.296564576 +0000 UTC m=+121.401594435" watchObservedRunningTime="2026-03-19 18:58:31.3011047 +0000 UTC m=+121.406134569" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.302384 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d99e721f162aaee4d89cddd3f5677f5d68f69e56ed3af82c50eeac4b48fbf5d5"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.312403 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.312487 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" event={"ID":"0fb8fbf8-b29b-4b07-a598-915a2c65affa","Type":"ContainerStarted","Data":"9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.312514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" event={"ID":"0fb8fbf8-b29b-4b07-a598-915a2c65affa","Type":"ContainerStarted","Data":"7827a311424867be68db5eed4736d9b251f516108bfa5d2cbe59a73184da0a66"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.313687 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.319242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.319335 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.319376 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.319497 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.319582 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.320028 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.820005062 +0000 UTC m=+121.925034971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.320559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.323541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.329665 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hb9f7" podStartSLOduration=56.329649779 podStartE2EDuration="56.329649779s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:31.320411176 +0000 UTC m=+121.425441025" watchObservedRunningTime="2026-03-19 18:58:31.329649779 +0000 UTC m=+121.434679628" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.330507 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-gls8c"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.345517 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d94q2" 
event={"ID":"e2933377-6670-4f1f-b5d7-ebe2a832c460","Type":"ContainerStarted","Data":"428cbfa64a9656c1e60be742dc4686ea4f4e48065300ff8a19a73dd8c9107416"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.345580 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d94q2" event={"ID":"e2933377-6670-4f1f-b5d7-ebe2a832c460","Type":"ContainerStarted","Data":"5bb170104d29d027df8ad1701e1fff6672d53d237b4aebc4a1d1a0420723391a"} Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.366895 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.375611 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mvtt7" podStartSLOduration=57.375593469 podStartE2EDuration="57.375593469s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:31.34528704 +0000 UTC m=+121.450316889" watchObservedRunningTime="2026-03-19 18:58:31.375593469 +0000 UTC m=+121.480623318" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.379221 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.382787 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.427179 5033 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podStartSLOduration=10.427154761 podStartE2EDuration="10.427154761s" podCreationTimestamp="2026-03-19 18:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:31.385553538 +0000 UTC m=+121.490583387" watchObservedRunningTime="2026-03-19 18:58:31.427154761 +0000 UTC m=+121.532184610" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.427255 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.429643 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g9hmn"] Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.429837 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:31.929825411 +0000 UTC m=+122.034855260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.430519 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-d94q2" podStartSLOduration=10.430504344 podStartE2EDuration="10.430504344s" podCreationTimestamp="2026-03-19 18:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:31.411293632 +0000 UTC m=+121.516323481" watchObservedRunningTime="2026-03-19 18:58:31.430504344 +0000 UTC m=+121.535534193" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.435773 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kcmn4"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.442885 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.445221 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.490829 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.500366 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.511091 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.529761 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.529929 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.02989879 +0000 UTC m=+122.134928639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.530170 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.530881 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.030872373 +0000 UTC m=+122.135902222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.560501 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8k27t"] Mar 19 18:58:31 crc kubenswrapper[5033]: W0319 18:58:31.566847 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce5d4445_04fd_4c00_8b1e_393386cc78ad.slice/crio-91430cedea52a863238a7b873da6ca75cf930deee7791d012b3c6df9956382a2 WatchSource:0}: Error finding container 91430cedea52a863238a7b873da6ca75cf930deee7791d012b3c6df9956382a2: Status 404 returned error can't find the container with id 91430cedea52a863238a7b873da6ca75cf930deee7791d012b3c6df9956382a2 Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.631853 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.632192 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.132178093 +0000 UTC m=+122.237207942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.699969 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.705727 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.738627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.739064 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.239047402 +0000 UTC m=+122.344077251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.839165 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.839937 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.339922298 +0000 UTC m=+122.444952147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: W0319 18:58:31.867959 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df2d9d5_b12f_4312_9e66_581123788ac5.slice/crio-4db981533130df4a78114f70f13c910fcf79c7bf8dafbc348330332db504c172 WatchSource:0}: Error finding container 4db981533130df4a78114f70f13c910fcf79c7bf8dafbc348330332db504c172: Status 404 returned error can't find the container with id 4db981533130df4a78114f70f13c910fcf79c7bf8dafbc348330332db504c172 Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.873653 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69v2"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.898544 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.917006 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.918632 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.931028 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x9vcw"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.938428 5033 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-78xnn"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.941013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:31 crc kubenswrapper[5033]: E0319 18:58:31.941306 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.441291581 +0000 UTC m=+122.546321430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.950817 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fvwzx"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.963219 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm"] Mar 19 18:58:31 crc kubenswrapper[5033]: W0319 18:58:31.964784 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040cee13_6799_4fe8_b64a_ad70d7b1185d.slice/crio-b57efd2fa3e54edb4563ebbac50bd4dbdc63000d5329f538757fc1608c38454b WatchSource:0}: Error finding container b57efd2fa3e54edb4563ebbac50bd4dbdc63000d5329f538757fc1608c38454b: Status 404 returned error can't find the container with id b57efd2fa3e54edb4563ebbac50bd4dbdc63000d5329f538757fc1608c38454b Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.989898 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.989958 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6"] Mar 19 18:58:31 crc kubenswrapper[5033]: I0319 18:58:31.998715 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.008513 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cz22k"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.020476 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.042466 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.043346 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.543324176 +0000 UTC m=+122.648354025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.058440 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dq52v"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.062792 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.065041 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.068628 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gmzg"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.070841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n"] Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.090620 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac25d35e_036b_4d7f_a9e8_2417681f2a1c.slice/crio-f9a09c59f38d5a2b4002fe08be268fc003f47af62936e9d4ce5c05a129d01b10 WatchSource:0}: Error finding 
container f9a09c59f38d5a2b4002fe08be268fc003f47af62936e9d4ce5c05a129d01b10: Status 404 returned error can't find the container with id f9a09c59f38d5a2b4002fe08be268fc003f47af62936e9d4ce5c05a129d01b10 Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.097858 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb71787f5_93bf_434a_b67d_63be189d843e.slice/crio-ba4d40fdb6cf2eb1adb44a46e61ce2d1d5abf1b9315073d8a5586b257bc585c9 WatchSource:0}: Error finding container ba4d40fdb6cf2eb1adb44a46e61ce2d1d5abf1b9315073d8a5586b257bc585c9: Status 404 returned error can't find the container with id ba4d40fdb6cf2eb1adb44a46e61ce2d1d5abf1b9315073d8a5586b257bc585c9 Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.117473 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73127891_1d5d_4371_87c0_82245ab12d5d.slice/crio-fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f WatchSource:0}: Error finding container fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f: Status 404 returned error can't find the container with id fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.118856 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d34ba5_eb17_4b36_b155_12a51c887d79.slice/crio-b00c0bd0b7e409371b0dd0fa72792b148625fb799f44324a83b46ac55eb7acce WatchSource:0}: Error finding container b00c0bd0b7e409371b0dd0fa72792b148625fb799f44324a83b46ac55eb7acce: Status 404 returned error can't find the container with id b00c0bd0b7e409371b0dd0fa72792b148625fb799f44324a83b46ac55eb7acce Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.127224 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449dd8eb_1af2_4fec_b84f_42b567d38342.slice/crio-ace1ac37d85611cb3e87b2573c67c49312373d9ea0bdc047adc7365bcad9d0ed WatchSource:0}: Error finding container ace1ac37d85611cb3e87b2573c67c49312373d9ea0bdc047adc7365bcad9d0ed: Status 404 returned error can't find the container with id ace1ac37d85611cb3e87b2573c67c49312373d9ea0bdc047adc7365bcad9d0ed Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.130760 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e74482_1056_4bfe_995e_10ec0dd18796.slice/crio-f005b817366a86529bfcee0b5a30efe2cc49e1cec341656442ea1fe702cc5259 WatchSource:0}: Error finding container f005b817366a86529bfcee0b5a30efe2cc49e1cec341656442ea1fe702cc5259: Status 404 returned error can't find the container with id f005b817366a86529bfcee0b5a30efe2cc49e1cec341656442ea1fe702cc5259 Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.134491 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9837c3e1_e614_408e_8914_c1390367407f.slice/crio-9d0f68dbfa554d3e422a582c0a3e3cf34c3299cd251d5898729ea11a79024ee5 WatchSource:0}: Error finding container 9d0f68dbfa554d3e422a582c0a3e3cf34c3299cd251d5898729ea11a79024ee5: Status 404 returned error can't find the container with id 9d0f68dbfa554d3e422a582c0a3e3cf34c3299cd251d5898729ea11a79024ee5 Mar 19 18:58:32 crc kubenswrapper[5033]: W0319 18:58:32.139738 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54f7af1_8f53_4aa5_8282_3f19aa50e57e.slice/crio-40955ffaf18f97788c2f3e28dc54de270d49f8d7265e149a5a6304ba817c27d6 WatchSource:0}: Error finding container 40955ffaf18f97788c2f3e28dc54de270d49f8d7265e149a5a6304ba817c27d6: Status 404 returned error can't find the container with id 
40955ffaf18f97788c2f3e28dc54de270d49f8d7265e149a5a6304ba817c27d6 Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.145645 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.146047 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.646027024 +0000 UTC m=+122.751056873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.247538 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.247800 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.747786519 +0000 UTC m=+122.852816368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.263248 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-ntlnh"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.276186 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.316825 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:32 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:32 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:32 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.317080 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.350983 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.351234 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.851186481 +0000 UTC m=+122.956216330 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.352608 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" event={"ID":"449dd8eb-1af2-4fec-b84f-42b567d38342","Type":"ContainerStarted","Data":"ace1ac37d85611cb3e87b2573c67c49312373d9ea0bdc047adc7365bcad9d0ed"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.354726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" event={"ID":"69d34ba5-eb17-4b36-b155-12a51c887d79","Type":"ContainerStarted","Data":"b00c0bd0b7e409371b0dd0fa72792b148625fb799f44324a83b46ac55eb7acce"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.357469 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" 
event={"ID":"01738363-9e09-480f-b65d-a76f264289eb","Type":"ContainerStarted","Data":"2d99144c5fff871c960a7df1af9189bfa42fc2e31470e880cc7293010698bb3c"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.357497 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" event={"ID":"01738363-9e09-480f-b65d-a76f264289eb","Type":"ContainerStarted","Data":"a8e1e54543b90e1df3cfbe771016059c649975decac18381d5b8d385534a588d"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.358286 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.361245 5033 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q7tcz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.361307 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" podUID="01738363-9e09-480f-b65d-a76f264289eb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.369740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" event={"ID":"e7c16707-41f6-4725-8944-f9e31b5ddbe6","Type":"ContainerStarted","Data":"6b1a78d4328ac0e9a98b1cee5dd78b61dd22e24baecaea977f3e108911431372"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.369800 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.369813 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" event={"ID":"e7c16707-41f6-4725-8944-f9e31b5ddbe6","Type":"ContainerStarted","Data":"9bb7168be0ca1472e22d0cefffb73664e558b59f229908a8fba761090213796f"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.372248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" event={"ID":"2fdbed76-a278-424a-8c8a-74d11ac4195e","Type":"ContainerStarted","Data":"2bed2ba40c8416ac7a2cc6008287f5f86216b81d85879b7e529fb22b6d08e4cc"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.376748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-gls8c" event={"ID":"05cd9325-9740-4a70-98a7-3de9ebb30035","Type":"ContainerStarted","Data":"82e57b8f88c5825e9a8bc576ca0bf3d5a54f57a7aae1cecbc76e4993030c0fd8"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.378011 5033 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2pw6j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.378061 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" podUID="e7c16707-41f6-4725-8944-f9e31b5ddbe6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.379615 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" podStartSLOduration=57.379604516 podStartE2EDuration="57.379604516s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.37885543 +0000 UTC m=+122.483885290" watchObservedRunningTime="2026-03-19 18:58:32.379604516 +0000 UTC m=+122.484634365" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.381980 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" event={"ID":"73127891-1d5d-4371-87c0-82245ab12d5d","Type":"ContainerStarted","Data":"fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.399819 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bbd8f53ac1c5a8f565638f6c783c289913601b5839715c64911fc07947e928e1"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.401680 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" event={"ID":"040cee13-6799-4fe8-b64a-ad70d7b1185d","Type":"ContainerStarted","Data":"b57efd2fa3e54edb4563ebbac50bd4dbdc63000d5329f538757fc1608c38454b"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.404034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" event={"ID":"1df2d9d5-b12f-4312-9e66-581123788ac5","Type":"ContainerStarted","Data":"4db981533130df4a78114f70f13c910fcf79c7bf8dafbc348330332db504c172"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.410630 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" podStartSLOduration=57.410614769 podStartE2EDuration="57.410614769s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.410162754 +0000 UTC m=+122.515192603" watchObservedRunningTime="2026-03-19 18:58:32.410614769 +0000 UTC m=+122.515644618" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.424118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" event={"ID":"a9dc5a6e-183c-4362-a581-598293c3310c","Type":"ContainerStarted","Data":"15e7fe55b740f8163e70a93652fcc32c7b64e108163f0fbefb160249cf9b25e5"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.435877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dq52v" event={"ID":"caab64e7-53a2-46de-833b-2c55da422e4b","Type":"ContainerStarted","Data":"ff37b11e9e76cc0541bf814a5b7d2dca6718c851f61f8dc8a12ed3f18085cb33"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.444672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jw55" event={"ID":"9837c3e1-e614-408e-8914-c1390367407f","Type":"ContainerStarted","Data":"9d0f68dbfa554d3e422a582c0a3e3cf34c3299cd251d5898729ea11a79024ee5"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.452970 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.454670 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:32.954645834 +0000 UTC m=+123.059675693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.465008 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" event={"ID":"15a23e16-4194-4773-a4e8-1c3515d31c5c","Type":"ContainerStarted","Data":"9b1c13274a80b4a298de62bcfb2809cdbf12ce7e2fcf20bdeaed93ea50b7dab4"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.486944 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" podStartSLOduration=57.486929921 podStartE2EDuration="57.486929921s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.450526914 +0000 UTC m=+122.555556763" watchObservedRunningTime="2026-03-19 18:58:32.486929921 +0000 UTC m=+122.591959770" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.487756 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" event={"ID":"78e74482-1056-4bfe-995e-10ec0dd18796","Type":"ContainerStarted","Data":"f005b817366a86529bfcee0b5a30efe2cc49e1cec341656442ea1fe702cc5259"} Mar 19 
18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.496311 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" event={"ID":"84e2fe23-8f48-484f-a955-85ea15c3c1fa","Type":"ContainerStarted","Data":"99e33b5dc814e66823b2ad6e007ceabe5298222ee1b87999e86509db15a0c885"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.514164 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" event={"ID":"3aea5341-06fc-4b54-90a5-860a408e6759","Type":"ContainerStarted","Data":"5da839553a2f6948be8b9b6932c92f59f95cba9172517d02e2e1dcf897ef62ff"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.514209 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" event={"ID":"3aea5341-06fc-4b54-90a5-860a408e6759","Type":"ContainerStarted","Data":"0f1569abfa8005a46d3a404c3d728aae58716f98bbc3b797a6351d5d65246041"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.537728 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" event={"ID":"1e837d74-e49f-4138-aba8-8170e284aeb3","Type":"ContainerStarted","Data":"f5820b00c52af48287d837fb3f6991ce5265fd2b5aaaa21955f04ec81e90e555"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.537770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" event={"ID":"1e837d74-e49f-4138-aba8-8170e284aeb3","Type":"ContainerStarted","Data":"0f99c4d51de91506f2efbbb06736a93177db24121d290ae2f91aa8c5c8494f9a"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.537781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" 
event={"ID":"1e837d74-e49f-4138-aba8-8170e284aeb3","Type":"ContainerStarted","Data":"13e7585cbf1748ad2a71d32205cbabed135ba61fbe0b680563fb8d08bb36d323"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.540419 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dzpft" podStartSLOduration=57.540408217 podStartE2EDuration="57.540408217s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.538412459 +0000 UTC m=+122.643442308" watchObservedRunningTime="2026-03-19 18:58:32.540408217 +0000 UTC m=+122.645438066" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.540551 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" podStartSLOduration=57.540547302 podStartE2EDuration="57.540547302s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.487647955 +0000 UTC m=+122.592677804" watchObservedRunningTime="2026-03-19 18:58:32.540547302 +0000 UTC m=+122.645577151" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.549644 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" event={"ID":"369378a6-ffae-46d6-bcce-575e033d51b9","Type":"ContainerStarted","Data":"dc9d9a535a84024176ce0b4a7dcd86447d0dcbdfefb24ce0b4d972c7c1d0708d"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.557115 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.559257 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.059244717 +0000 UTC m=+123.164274566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.563114 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d4vcj" podStartSLOduration=57.563099497 podStartE2EDuration="57.563099497s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.562560359 +0000 UTC m=+122.667590218" watchObservedRunningTime="2026-03-19 18:58:32.563099497 +0000 UTC m=+122.668129336" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.574261 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" event={"ID":"84dcb066-a28e-4283-ba72-b2958adec642","Type":"ContainerStarted","Data":"710428bc6b43e1f24341112d3715d0f73b1f941d21177245d334815199ab2e79"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.574293 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" event={"ID":"84dcb066-a28e-4283-ba72-b2958adec642","Type":"ContainerStarted","Data":"f540206c8fcc876620fd2fc3b6d5caddee7c4bd4173523f7114bc9353b671cd3"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.574640 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.578383 5033 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-n8gqn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.578431 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" podUID="84dcb066-a28e-4283-ba72-b2958adec642" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.583060 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerStarted","Data":"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.583201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerStarted","Data":"91430cedea52a863238a7b873da6ca75cf930deee7791d012b3c6df9956382a2"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.583945 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.600152 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" podStartSLOduration=57.600135815 podStartE2EDuration="57.600135815s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.598412287 +0000 UTC m=+122.703442136" watchObservedRunningTime="2026-03-19 18:58:32.600135815 +0000 UTC m=+122.705165664" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.613826 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" podStartSLOduration=57.613671765 podStartE2EDuration="57.613671765s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:32.61235366 +0000 UTC m=+122.717383509" watchObservedRunningTime="2026-03-19 18:58:32.613671765 +0000 UTC m=+122.718701614" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.616271 5033 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hqvsx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.616363 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 
10.217.0.23:8080: connect: connection refused" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.658510 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.658858 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.158832198 +0000 UTC m=+123.263862047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.659067 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.660413 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:33.160399082 +0000 UTC m=+123.265428931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.717362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x9vcw" event={"ID":"ac25d35e-036b-4d7f-a9e8-2417681f2a1c","Type":"ContainerStarted","Data":"f9a09c59f38d5a2b4002fe08be268fc003f47af62936e9d4ce5c05a129d01b10"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.735674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gmzg" event={"ID":"a54f7af1-8f53-4aa5-8282-3f19aa50e57e","Type":"ContainerStarted","Data":"40955ffaf18f97788c2f3e28dc54de270d49f8d7265e149a5a6304ba817c27d6"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.741686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" event={"ID":"3fd1faec-c933-445c-a6cd-71491f96a8dd","Type":"ContainerStarted","Data":"47520ed1433dc89a13bc5dde628da3dd4d97d175f679ae815e9137776e134c14"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.741727 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" event={"ID":"3fd1faec-c933-445c-a6cd-71491f96a8dd","Type":"ContainerStarted","Data":"e91dd86a5096257ffcdf9c8d4f6f9f676cb3a665f67289a039af44528002d819"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.749755 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" event={"ID":"f5c9eadf-046d-4332-9664-e9b48f0e3ad0","Type":"ContainerStarted","Data":"0ed2263e90eb2d09e1e85fa038d88ae06d2e8740e1ef1f1a6a29c15f177d0ee4"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.761003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.761309 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.261292258 +0000 UTC m=+123.366322107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.788374 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" event={"ID":"e1a4c325-85dd-463a-ae31-c726008e9b08","Type":"ContainerStarted","Data":"45b922ae4bcbfc050df44a1adc6705c70facf409c334e7179a29837d7bd97983"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.791892 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41800: no serving certificate available for the kubelet" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.801437 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" event={"ID":"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4","Type":"ContainerStarted","Data":"8bcd091b263e4a2b8bd8524341d36bf925e36012249f4c7f99e7181bc29a36d7"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.811700 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" event={"ID":"b71787f5-93bf-434a-b67d-63be189d843e","Type":"ContainerStarted","Data":"ba4d40fdb6cf2eb1adb44a46e61ce2d1d5abf1b9315073d8a5586b257bc585c9"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.822293 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvwzx" event={"ID":"b0fcc05b-da6b-436a-b895-8ae31a630fad","Type":"ContainerStarted","Data":"0bbc0968bb93f6190c54853b8a1fae2235a8c0945565a36ed850f9f68548a7c0"} Mar 19 18:58:32 crc 
kubenswrapper[5033]: I0319 18:58:32.832968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"313305e24448bbc46e28e61d0903dde1f97cbac747f9ba33134c8f470b5af205"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.833018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"38bf829a68811f7b64608cbd8ff6a268581098ce99aee9054f15d6f238015dbf"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.862283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.862598 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.362584158 +0000 UTC m=+123.467614007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.890066 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41802: no serving certificate available for the kubelet" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.899437 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kcmn4" event={"ID":"5120920c-fe7c-454a-9dd5-9c0b79e0fb04","Type":"ContainerStarted","Data":"e185cf749c6079c28278f4d44070a200f37b3985695a9826bac7687755ffdb4f"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.899504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kcmn4" event={"ID":"5120920c-fe7c-454a-9dd5-9c0b79e0fb04","Type":"ContainerStarted","Data":"87fa7d289e2815be08d560590e2d3eb5ffc21621bc7d40b66672aa9f9089e326"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.915421 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9a852e1c10342f173aaab4219fbe57fc5f5ece5b359e600404fe67995b1ae913"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.915473 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0d1ca2fe15a7088a25a902e132c897a9786168365837b822f973d66511072238"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 
18:58:32.915655 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.926901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" event={"ID":"6f8d44d7-9ede-4090-96da-f59dc7a10cf0","Type":"ContainerStarted","Data":"50ce033c29d2c32a7af68577dfd1b0f6e945950865ff0d7bd06678d1c53c0a67"} Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.963161 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.963637 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.463609839 +0000 UTC m=+123.568639688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.963855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:32 crc kubenswrapper[5033]: E0319 18:58:32.965175 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.465165882 +0000 UTC m=+123.570195731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:32 crc kubenswrapper[5033]: I0319 18:58:32.997041 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41810: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.067306 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.069034 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.569000348 +0000 UTC m=+123.674030197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.100641 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41820: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.169320 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.169752 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.669737989 +0000 UTC m=+123.774767838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.208560 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41826: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.273915 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.274364 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.774347852 +0000 UTC m=+123.879377701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.321846 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:33 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:33 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:33 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.322155 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.364971 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41834: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.375407 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.375886 5033 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.875872669 +0000 UTC m=+123.980902518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.478385 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.479107 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:33.979084544 +0000 UTC m=+124.084114393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.566469 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41836: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.581594 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.581953 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.081940097 +0000 UTC m=+124.186969946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.657796 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.657843 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.676031 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.682405 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.682831 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.182811872 +0000 UTC m=+124.287841721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.784699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.785704 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.285682295 +0000 UTC m=+124.390712144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.885915 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.886176 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.386162478 +0000 UTC m=+124.491192327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.943921 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" event={"ID":"78e74482-1056-4bfe-995e-10ec0dd18796","Type":"ContainerStarted","Data":"59a7ad550be90a48d9a499f5b1817443bed657be9907e57b01b57a66f989d419"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.943965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" event={"ID":"78e74482-1056-4bfe-995e-10ec0dd18796","Type":"ContainerStarted","Data":"39e8ef7e9ac8c0e0dcd9d747c7d483aa091d0ecf4cc7630f5fa57ea75ad01b8d"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.948842 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" event={"ID":"369378a6-ffae-46d6-bcce-575e033d51b9","Type":"ContainerStarted","Data":"42a2e43dd73c2026c1008286a3f952d6feb8739062668e83ddc88ab6c1aace9d"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.951632 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41848: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.954793 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b2c0014-3594-464c-a0e8-8e4e36301dc1","Type":"ContainerStarted","Data":"3fdcb4ce4b83ac63158b60f230c982b00c8d572c3c25f4589d2ba22a47f19717"} Mar 19 18:58:33 crc 
kubenswrapper[5033]: I0319 18:58:33.954835 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b2c0014-3594-464c-a0e8-8e4e36301dc1","Type":"ContainerStarted","Data":"b553c07e4c33c11380185381b03bc048307516dcfae9f29c0b41dceff58a93a6"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.973466 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" event={"ID":"f5c9eadf-046d-4332-9664-e9b48f0e3ad0","Type":"ContainerStarted","Data":"f86edef7dbf4d76f497d0ed1a796a12354b975fc60985553b5de1417d166d420"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.973526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" event={"ID":"f5c9eadf-046d-4332-9664-e9b48f0e3ad0","Type":"ContainerStarted","Data":"a8577cf90558fbc328940ee7548bfc53d0cf79c252590a5af0c1aef3a619ecd8"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.976784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" event={"ID":"6f8d44d7-9ede-4090-96da-f59dc7a10cf0","Type":"ContainerStarted","Data":"ac45b5c8b442614660053e3d58498071dc4791a0462119a65f3d11a0a2aa2737"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.979862 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvwzx" event={"ID":"b0fcc05b-da6b-436a-b895-8ae31a630fad","Type":"ContainerStarted","Data":"80e9ca5eaf4bc18ed7cbc91c503ed4b768572f1f00c25ff6709e2dfd7df493f3"} Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.979887 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fvwzx" event={"ID":"b0fcc05b-da6b-436a-b895-8ae31a630fad","Type":"ContainerStarted","Data":"c30e7ccab1ef2116e855743a22ddf56d3e87f21fc994b5243a942b737642e14c"} Mar 19 
18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.980067 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.982675 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kfrq6" podStartSLOduration=58.982648024 podStartE2EDuration="58.982648024s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:33.982364685 +0000 UTC m=+124.087394534" watchObservedRunningTime="2026-03-19 18:58:33.982648024 +0000 UTC m=+124.087677873" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.988881 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:33 crc kubenswrapper[5033]: I0319 18:58:33.989550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" event={"ID":"e1a4c325-85dd-463a-ae31-c726008e9b08","Type":"ContainerStarted","Data":"d2f89546935304afe7c5f9d7e0f625fb307cf24fec7b6c3d4a0ba7aa0946bc36"} Mar 19 18:58:33 crc kubenswrapper[5033]: E0319 18:58:33.990187 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.49017419 +0000 UTC m=+124.595204029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.008191 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.008233 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.011836 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f69v2" event={"ID":"a9dc5a6e-183c-4362-a581-598293c3310c","Type":"ContainerStarted","Data":"946c6cfe2cc0b3a6db350cbc0ea1a6a4fbddd8b9c4c80c4df385a566159c383e"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.018286 5033 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2rhhh container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]log ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]etcd ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/max-in-flight-filter ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 18:58:34 crc kubenswrapper[5033]: 
[+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 18:58:34 crc kubenswrapper[5033]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-startinformers ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 18:58:34 crc kubenswrapper[5033]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 18:58:34 crc kubenswrapper[5033]: livez check failed Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.018357 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" podUID="e1f5e565-ab78-4e77-9cd5-17fab05529cb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.023609 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" event={"ID":"449dd8eb-1af2-4fec-b84f-42b567d38342","Type":"ContainerStarted","Data":"9384e417a0bc96aa230c8587f116fd59d6f458c532c7575203f036b5bd37697e"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.036801 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.036778723 podStartE2EDuration="3.036778723s" podCreationTimestamp="2026-03-19 18:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.034877908 
+0000 UTC m=+124.139907757" watchObservedRunningTime="2026-03-19 18:58:34.036778723 +0000 UTC m=+124.141808572" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.042708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" event={"ID":"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4","Type":"ContainerStarted","Data":"01947136958c66729b244e6c96c4876172a0be1d52298863667e7fc440823e01"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.042770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" event={"ID":"91eba8c1-9e26-4e3f-b9a4-3c0c2d18b9f4","Type":"ContainerStarted","Data":"1994d3b9db0b633c4a01cb958879e47403d2bb62c4fdc390186cf9c4f5961361"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.082460 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x9vcw" event={"ID":"ac25d35e-036b-4d7f-a9e8-2417681f2a1c","Type":"ContainerStarted","Data":"4d13e6f3de57f14000805e841f837c237e44cbfa7b924cf3120d2fd1e4c55c7c"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.083284 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.086811 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-x9vcw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.087083 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x9vcw" podUID="ac25d35e-036b-4d7f-a9e8-2417681f2a1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 19 
18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.090646 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.091325 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.591297874 +0000 UTC m=+124.696327713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.092667 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.101765 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l7dwm" podStartSLOduration=59.101725098 podStartE2EDuration="59.101725098s" 
podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.101597394 +0000 UTC m=+124.206627253" watchObservedRunningTime="2026-03-19 18:58:34.101725098 +0000 UTC m=+124.206754947" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.106630 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.606598604 +0000 UTC m=+124.711628443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.132382 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" event={"ID":"84e2fe23-8f48-484f-a955-85ea15c3c1fa","Type":"ContainerStarted","Data":"c675682a6c747101088381a7e29d105da00bdc9999a7ffdbab0fc8a3f90ed54d"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.152114 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-l5jjr" event={"ID":"040cee13-6799-4fe8-b64a-ad70d7b1185d","Type":"ContainerStarted","Data":"c65148bcc227fb77975a7e45a7d816834d56b759fa33b8c3849bcef829490ec7"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.166168 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="2fdbed76-a278-424a-8c8a-74d11ac4195e" containerID="1615beb395cc6068728f88dcda14fd8fa80582264d02e34a573c6ea830e5651b" exitCode=0 Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.166255 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" event={"ID":"2fdbed76-a278-424a-8c8a-74d11ac4195e","Type":"ContainerDied","Data":"1615beb395cc6068728f88dcda14fd8fa80582264d02e34a573c6ea830e5651b"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.167099 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k4qmh" podStartSLOduration=60.167075388 podStartE2EDuration="1m0.167075388s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.132132601 +0000 UTC m=+124.237162450" watchObservedRunningTime="2026-03-19 18:58:34.167075388 +0000 UTC m=+124.272105237" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.172560 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fvwzx" podStartSLOduration=13.172515082 podStartE2EDuration="13.172515082s" podCreationTimestamp="2026-03-19 18:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.164948835 +0000 UTC m=+124.269978684" watchObservedRunningTime="2026-03-19 18:58:34.172515082 +0000 UTC m=+124.277544921" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.188783 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" event={"ID":"15a23e16-4194-4773-a4e8-1c3515d31c5c","Type":"ContainerStarted","Data":"a932d923b268628a29f740ff0c235d16d05c11fdc8372667a4c2f1b3b4640187"} Mar 19 18:58:34 crc 
kubenswrapper[5033]: I0319 18:58:34.195086 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.196279 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.696263849 +0000 UTC m=+124.801293698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.220185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" event={"ID":"73127891-1d5d-4371-87c0-82245ab12d5d","Type":"ContainerStarted","Data":"d3c47cf3115467b6551c8ae8875310d6b60d9e4a0b4458cbfca0f09e4cadc039"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.224002 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dq52v" event={"ID":"caab64e7-53a2-46de-833b-2c55da422e4b","Type":"ContainerStarted","Data":"3b0cc42118f35d596171b04055a4e42f3ef8b96728f496e799ea14bb1d49f4b0"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.224036 5033 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.241162 5033 patch_prober.go:28] interesting pod/console-operator-58897d9998-dq52v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.241215 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dq52v" podUID="caab64e7-53a2-46de-833b-2c55da422e4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.242869 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8k27t" podStartSLOduration=59.242854691 podStartE2EDuration="59.242854691s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.242470048 +0000 UTC m=+124.347499897" watchObservedRunningTime="2026-03-19 18:58:34.242854691 +0000 UTC m=+124.347884540" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.243390 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bppkn" podStartSLOduration=59.243385359 podStartE2EDuration="59.243385359s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.204203679 +0000 UTC m=+124.309233528" 
watchObservedRunningTime="2026-03-19 18:58:34.243385359 +0000 UTC m=+124.348415208" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.257964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jw55" event={"ID":"9837c3e1-e614-408e-8914-c1390367407f","Type":"ContainerStarted","Data":"2d5ad4bf99ff8b7260e0c685b4a4dfc5f50064566ad5715b7ca4dc626b378755"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.273255 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x9vcw" podStartSLOduration=60.273224923 podStartE2EDuration="1m0.273224923s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.270940205 +0000 UTC m=+124.375970054" watchObservedRunningTime="2026-03-19 18:58:34.273224923 +0000 UTC m=+124.378254762" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.276097 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" event={"ID":"b71787f5-93bf-434a-b67d-63be189d843e","Type":"ContainerStarted","Data":"9e24d71d361a1ddd3c54324be20ad7acea7c0a362ac1e660b7c9dab28ee4ac5e"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.299443 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.300516 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.800501979 +0000 UTC m=+124.905531828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.305695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" event={"ID":"69d34ba5-eb17-4b36-b155-12a51c887d79","Type":"ContainerStarted","Data":"f97bf589fb7194dfcae7c78de8600dbb63d5c6ddcdb0a0f74fe8bf190a219957"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.305741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" event={"ID":"69d34ba5-eb17-4b36-b155-12a51c887d79","Type":"ContainerStarted","Data":"4a81b2e090c9bffc89560b8852f025a0593f592b6e60b3e926b31dee7962d0aa"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.318800 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:34 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:34 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:34 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.318848 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.338403 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47hmx" podStartSLOduration=60.338383825 podStartE2EDuration="1m0.338383825s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.298752519 +0000 UTC m=+124.403782368" watchObservedRunningTime="2026-03-19 18:58:34.338383825 +0000 UTC m=+124.443413674" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.340922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kcmn4" event={"ID":"5120920c-fe7c-454a-9dd5-9c0b79e0fb04","Type":"ContainerStarted","Data":"64bcbd6872fe6183f24167a40d8f60fceea84119af2b45343b9acc6e052473c3"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.348606 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gmzg" event={"ID":"a54f7af1-8f53-4aa5-8282-3f19aa50e57e","Type":"ContainerStarted","Data":"3e1fa0c88fe4ca555216865899b1bd200d80481ecc878b7bc816e1cefbc16aa0"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.354824 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" event={"ID":"1df2d9d5-b12f-4312-9e66-581123788ac5","Type":"ContainerStarted","Data":"1e313a2a9cbdbb7ba8bd0a3e87012615211c8f30b48199e2a0c240ba189b7a51"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.354858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" 
event={"ID":"1df2d9d5-b12f-4312-9e66-581123788ac5","Type":"ContainerStarted","Data":"7d4c32fddca8830a88bea1f635fa2311dd50b55b7c47de6b33c98f7fb41c6cb2"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.368802 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" event={"ID":"3fd1faec-c933-445c-a6cd-71491f96a8dd","Type":"ContainerStarted","Data":"33ed8176845fcbdd7efe130e69450ffd149d4103edfa35fbe46c078c5bd786e0"} Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.368848 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.370904 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" gracePeriod=30 Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.371642 5033 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hqvsx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.371712 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.384324 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2pw6j" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.386619 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgbp" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.388073 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7tcz" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.401059 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.403116 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:34.903100883 +0000 UTC m=+125.008130722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.426638 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w5g2b" podStartSLOduration=59.426621832 podStartE2EDuration="59.426621832s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.421701935 +0000 UTC m=+124.526731784" watchObservedRunningTime="2026-03-19 18:58:34.426621832 +0000 UTC m=+124.531651681" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.428129 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-757h6" podStartSLOduration=60.428122673 podStartE2EDuration="1m0.428122673s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.383682524 +0000 UTC m=+124.488712383" watchObservedRunningTime="2026-03-19 18:58:34.428122673 +0000 UTC m=+124.533152522" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.495280 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" podStartSLOduration=59.495263083 podStartE2EDuration="59.495263083s" podCreationTimestamp="2026-03-19 
18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.477777889 +0000 UTC m=+124.582807738" watchObservedRunningTime="2026-03-19 18:58:34.495263083 +0000 UTC m=+124.600292932" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.495707 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2zhl"] Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.496646 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.499852 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.500249 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kcmn4" podStartSLOduration=60.500212211 podStartE2EDuration="1m0.500212211s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.497281192 +0000 UTC m=+124.602311041" watchObservedRunningTime="2026-03-19 18:58:34.500212211 +0000 UTC m=+124.605242070" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.505425 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.505825 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.005810281 +0000 UTC m=+125.110840130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.518789 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2zhl"] Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.545793 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-78xnn" podStartSLOduration=59.545774708 podStartE2EDuration="59.545774708s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.537803928 +0000 UTC m=+124.642833777" watchObservedRunningTime="2026-03-19 18:58:34.545774708 +0000 UTC m=+124.650804557" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.607509 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.607726 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.607761 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ps22\" (UniqueName: \"kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.607814 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.607929 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.107915459 +0000 UTC m=+125.212945308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.611822 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9gmzg" podStartSLOduration=13.611807751 podStartE2EDuration="13.611807751s" podCreationTimestamp="2026-03-19 18:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.566915356 +0000 UTC m=+124.671945215" watchObservedRunningTime="2026-03-19 18:58:34.611807751 +0000 UTC m=+124.716837600" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.642988 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" podStartSLOduration=60.642973919 podStartE2EDuration="1m0.642973919s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.613072964 +0000 UTC m=+124.718102813" watchObservedRunningTime="2026-03-19 18:58:34.642973919 +0000 UTC m=+124.748003768" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.654181 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41850: no serving certificate available for the kubelet" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.677717 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"] Mar 19 18:58:34 crc 
kubenswrapper[5033]: I0319 18:58:34.678679 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.683276 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.713803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.713859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.713915 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnfqz\" (UniqueName: \"kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.713957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.714000 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.714026 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.714065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ps22\" (UniqueName: \"kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.714763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.715955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " 
pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.716262 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.216243457 +0000 UTC m=+125.321273306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.720524 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5jw55" podStartSLOduration=60.720494542 podStartE2EDuration="1m0.720494542s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.700268505 +0000 UTC m=+124.805298364" watchObservedRunningTime="2026-03-19 18:58:34.720494542 +0000 UTC m=+124.825524411" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.741623 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"] Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.774950 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ps22\" (UniqueName: \"kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22\") pod \"community-operators-q2zhl\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " 
pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.780653 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n8gqn" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.814906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.815173 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.815219 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnfqz\" (UniqueName: \"kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.815253 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.815690 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 18:58:34.815764 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.315748167 +0000 UTC m=+125.420778016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.815992 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.828436 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dq52v" podStartSLOduration=60.828418697000004 podStartE2EDuration="1m0.828418697s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.756713762 +0000 UTC m=+124.861743611" 
watchObservedRunningTime="2026-03-19 18:58:34.828418697 +0000 UTC m=+124.933448536" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.845846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnfqz\" (UniqueName: \"kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz\") pod \"certified-operators-hhm4g\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.852983 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.859194 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-msk9m" podStartSLOduration=59.859178792 podStartE2EDuration="59.859178792s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.854554965 +0000 UTC m=+124.959584814" watchObservedRunningTime="2026-03-19 18:58:34.859178792 +0000 UTC m=+124.964208641" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.876242 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.877605 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.912468 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.918884 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.918963 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.918993 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.919033 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:34 crc kubenswrapper[5033]: E0319 
18:58:34.919339 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.419326464 +0000 UTC m=+125.524356313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:34 crc kubenswrapper[5033]: I0319 18:58:34.962774 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-75f9n" podStartSLOduration=59.962757809 podStartE2EDuration="59.962757809s" podCreationTimestamp="2026-03-19 18:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:34.929197879 +0000 UTC m=+125.034227738" watchObservedRunningTime="2026-03-19 18:58:34.962757809 +0000 UTC m=+125.067787658" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.002075 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.021513 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.021672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.021715 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.021784 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.022182 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:35.522163297 +0000 UTC m=+125.627193146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.023334 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.023790 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.052295 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz\") pod \"community-operators-bhrx9\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.085368 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.086389 5033 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.118006 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.125048 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.125485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.125538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.125567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfnnl\" (UniqueName: \"kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc 
kubenswrapper[5033]: E0319 18:58:35.125911 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.62589925 +0000 UTC m=+125.730929099 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.217178 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.232408 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.232647 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.232903 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.232930 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfnnl\" (UniqueName: \"kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.232989 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.732974476 +0000 UTC m=+125.838004325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.233372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.233406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.289344 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfnnl\" (UniqueName: \"kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl\") pod \"certified-operators-np9gj\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.324197 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:35 crc kubenswrapper[5033]: [-]has-synced failed: reason 
withheld Mar 19 18:58:35 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:35 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.324280 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.334180 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.334549 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.834536845 +0000 UTC m=+125.939566694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.419948 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.428594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" event={"ID":"2fdbed76-a278-424a-8c8a-74d11ac4195e","Type":"ContainerStarted","Data":"04f11406d46c2fcaa8d8da7afe3f1b5fb246be0f8b62f5af5531f1f30e45b810"} Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.428667 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.440027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.440305 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:35.940288966 +0000 UTC m=+126.045318815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.440315 5033 generic.go:334] "Generic (PLEG): container finished" podID="6b2c0014-3594-464c-a0e8-8e4e36301dc1" containerID="3fdcb4ce4b83ac63158b60f230c982b00c8d572c3c25f4589d2ba22a47f19717" exitCode=0 Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.440431 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b2c0014-3594-464c-a0e8-8e4e36301dc1","Type":"ContainerDied","Data":"3fdcb4ce4b83ac63158b60f230c982b00c8d572c3c25f4589d2ba22a47f19717"} Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.441186 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-x9vcw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.441222 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x9vcw" podUID="ac25d35e-036b-4d7f-a9e8-2417681f2a1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.443938 5033 patch_prober.go:28] interesting pod/console-operator-58897d9998-dq52v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.443980 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dq52v" podUID="caab64e7-53a2-46de-833b-2c55da422e4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.475523 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" podStartSLOduration=61.475500952 podStartE2EDuration="1m1.475500952s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:35.473784224 +0000 UTC m=+125.578814073" watchObservedRunningTime="2026-03-19 18:58:35.475500952 +0000 UTC m=+125.580530801" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.478970 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.542489 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.579981 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-19 18:58:36.07996105 +0000 UTC m=+126.184990899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.609239 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2zhl"] Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.646422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.647220 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.147199893 +0000 UTC m=+126.252229742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.681306 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"] Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.753081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.754171 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.254151655 +0000 UTC m=+126.359181504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.835946 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.860496 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.860668 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.360638211 +0000 UTC m=+126.465668060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.860744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.861043 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.361035135 +0000 UTC m=+126.466064984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:35 crc kubenswrapper[5033]: W0319 18:58:35.898407 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86345a30_6cc0_4359_8051_e508c71833f4.slice/crio-1ed49b2d2c6e0f3040b0b800f6960989c5e1bbaf71098993481981a5ef4dc05d WatchSource:0}: Error finding container 1ed49b2d2c6e0f3040b0b800f6960989c5e1bbaf71098993481981a5ef4dc05d: Status 404 returned error can't find the container with id 1ed49b2d2c6e0f3040b0b800f6960989c5e1bbaf71098993481981a5ef4dc05d Mar 19 18:58:35 crc kubenswrapper[5033]: I0319 18:58:35.964395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:35 crc kubenswrapper[5033]: E0319 18:58:35.964750 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.464734657 +0000 UTC m=+126.569764506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.008053 5033 ???:1] "http: TLS handshake error from 192.168.126.11:41864: no serving certificate available for the kubelet" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.068836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.069200 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.569183284 +0000 UTC m=+126.674213133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.073808 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.170213 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.170373 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.670346249 +0000 UTC m=+126.775376088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.170499 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.170750 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.670740192 +0000 UTC m=+126.775770041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.231275 5033 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.271588 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.271737 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.771715142 +0000 UTC m=+126.876744991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.272053 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.272326 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.772314782 +0000 UTC m=+126.877344631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.294739 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.294941 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" containerName="controller-manager" containerID="cri-o://9b7f72059c52fd7f7137968fb0f6dc6285a93d73ec410732ac3fe104eb540603" gracePeriod=30 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.309050 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.309238 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" podUID="f123389a-beda-4156-bd02-56ccf2a479f1" containerName="route-controller-manager" containerID="cri-o://0fe1b544c17f74855bb0c576294157fc23c2333aa9087cc054bce5f18662842e" gracePeriod=30 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.317918 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:36 crc 
kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:36 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:36 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.318410 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.373305 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.373440 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.873422456 +0000 UTC m=+126.978452295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.373513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.373809 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:36.873801559 +0000 UTC m=+126.978831408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.389965 5033 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.474381 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.474595 5033 generic.go:334] "Generic (PLEG): container finished" podID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" containerID="9b7f72059c52fd7f7137968fb0f6dc6285a93d73ec410732ac3fe104eb540603" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.474631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" event={"ID":"dabe1e76-f401-4d8c-99a4-36f7acc7e241","Type":"ContainerDied","Data":"9b7f72059c52fd7f7137968fb0f6dc6285a93d73ec410732ac3fe104eb540603"} Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.474711 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:36.974672094 +0000 UTC m=+127.079701943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.482107 5033 generic.go:334] "Generic (PLEG): container finished" podID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerID="4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.482152 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerDied","Data":"4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.482189 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerStarted","Data":"eff53e488b55902d1f03c66ebbbdf2c5905f4732b5ac731df901ac34619fc8fb"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.483860 5033 generic.go:334] "Generic (PLEG): container finished" podID="86345a30-6cc0-4359-8051-e508c71833f4" containerID="03de50e51d88a9f6075957331348b2d126d597f2ef83383125b95a6d01c7d017" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.483935 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" 
event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerDied","Data":"03de50e51d88a9f6075957331348b2d126d597f2ef83383125b95a6d01c7d017"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.483968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerStarted","Data":"1ed49b2d2c6e0f3040b0b800f6960989c5e1bbaf71098993481981a5ef4dc05d"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.519661 5033 generic.go:334] "Generic (PLEG): container finished" podID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerID="37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.519722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerDied","Data":"37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.519746 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerStarted","Data":"184926c058485d1aeafbf8355983ef4916ea6fe35b71d55cf278023ab36b1c46"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.546224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" event={"ID":"15a23e16-4194-4773-a4e8-1c3515d31c5c","Type":"ContainerStarted","Data":"6faa0dfcf2f4cdf132a6191aa1d8ff99fef030b46e8115b9e5b78335624dd681"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.546273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" 
event={"ID":"15a23e16-4194-4773-a4e8-1c3515d31c5c","Type":"ContainerStarted","Data":"85c91781e4114d995336306d23dd879fc36e47ee6c1270b3a58700fdbb7e8f7d"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.576283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.576581 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:37.076570035 +0000 UTC m=+127.181599884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sl57l" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.615706 5033 generic.go:334] "Generic (PLEG): container finished" podID="f123389a-beda-4156-bd02-56ccf2a479f1" containerID="0fe1b544c17f74855bb0c576294157fc23c2333aa9087cc054bce5f18662842e" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.615784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" 
event={"ID":"f123389a-beda-4156-bd02-56ccf2a479f1","Type":"ContainerDied","Data":"0fe1b544c17f74855bb0c576294157fc23c2333aa9087cc054bce5f18662842e"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.617985 5033 generic.go:334] "Generic (PLEG): container finished" podID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerID="3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418" exitCode=0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.618162 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerDied","Data":"3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.618211 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerStarted","Data":"8dbf8fa3af2616e8a86b072d81a8cba2c773bb8524a0c88854a21fd504612496"} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.677695 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: E0319 18:58:36.680021 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:37.179999777 +0000 UTC m=+127.285029626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.685671 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"] Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.688248 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.696200 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.702957 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"] Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.722603 5033 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T18:58:36.389994478Z","Handler":null,"Name":""} Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.753193 5033 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.753226 5033 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 18:58:36 crc 
kubenswrapper[5033]: I0319 18:58:36.777252 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.789402 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.789506 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.789567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nkq\" (UniqueName: \"kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.790490 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.793398 5033 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.793430 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.842414 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sl57l\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.856296 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.871433 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.883949 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert\") pod \"f123389a-beda-4156-bd02-56ccf2a479f1\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config\") pod \"f123389a-beda-4156-bd02-56ccf2a479f1\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles\") pod \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891588 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca\") pod \"f123389a-beda-4156-bd02-56ccf2a479f1\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891650 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpmsg\" (UniqueName: \"kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg\") pod \"f123389a-beda-4156-bd02-56ccf2a479f1\" (UID: \"f123389a-beda-4156-bd02-56ccf2a479f1\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891680 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca\") pod \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891713 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxhqs\" (UniqueName: \"kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs\") pod \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config\") pod \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert\") pod \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\" (UID: \"dabe1e76-f401-4d8c-99a4-36f7acc7e241\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.891940 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.892216 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " 
pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.892289 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89nkq\" (UniqueName: \"kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.892323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.894113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.898916 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config" (OuterVolumeSpecName: "config") pod "dabe1e76-f401-4d8c-99a4-36f7acc7e241" (UID: "dabe1e76-f401-4d8c-99a4-36f7acc7e241"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.898954 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config" (OuterVolumeSpecName: "config") pod "f123389a-beda-4156-bd02-56ccf2a479f1" (UID: "f123389a-beda-4156-bd02-56ccf2a479f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.899023 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "f123389a-beda-4156-bd02-56ccf2a479f1" (UID: "f123389a-beda-4156-bd02-56ccf2a479f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.899484 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dabe1e76-f401-4d8c-99a4-36f7acc7e241" (UID: "dabe1e76-f401-4d8c-99a4-36f7acc7e241"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.900192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.900633 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f123389a-beda-4156-bd02-56ccf2a479f1" (UID: "f123389a-beda-4156-bd02-56ccf2a479f1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.903014 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca" (OuterVolumeSpecName: "client-ca") pod "dabe1e76-f401-4d8c-99a4-36f7acc7e241" (UID: "dabe1e76-f401-4d8c-99a4-36f7acc7e241"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.905947 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg" (OuterVolumeSpecName: "kube-api-access-zpmsg") pod "f123389a-beda-4156-bd02-56ccf2a479f1" (UID: "f123389a-beda-4156-bd02-56ccf2a479f1"). InnerVolumeSpecName "kube-api-access-zpmsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.916854 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.916884 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nkq\" (UniqueName: \"kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq\") pod \"redhat-marketplace-6dvj7\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.926985 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dabe1e76-f401-4d8c-99a4-36f7acc7e241" (UID: "dabe1e76-f401-4d8c-99a4-36f7acc7e241"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.932683 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs" (OuterVolumeSpecName: "kube-api-access-mxhqs") pod "dabe1e76-f401-4d8c-99a4-36f7acc7e241" (UID: "dabe1e76-f401-4d8c-99a4-36f7acc7e241"). InnerVolumeSpecName "kube-api-access-mxhqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.971844 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995138 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f123389a-beda-4156-bd02-56ccf2a479f1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995181 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995193 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995205 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f123389a-beda-4156-bd02-56ccf2a479f1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995218 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpmsg\" (UniqueName: \"kubernetes.io/projected/f123389a-beda-4156-bd02-56ccf2a479f1-kube-api-access-zpmsg\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995230 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995240 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxhqs\" (UniqueName: \"kubernetes.io/projected/dabe1e76-f401-4d8c-99a4-36f7acc7e241-kube-api-access-mxhqs\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 
18:58:36.995251 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dabe1e76-f401-4d8c-99a4-36f7acc7e241-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:36 crc kubenswrapper[5033]: I0319 18:58:36.995260 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabe1e76-f401-4d8c-99a4-36f7acc7e241-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.027206 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068321 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:58:37 crc kubenswrapper[5033]: E0319 18:58:37.068567 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2c0014-3594-464c-a0e8-8e4e36301dc1" containerName="pruner" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068606 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2c0014-3594-464c-a0e8-8e4e36301dc1" containerName="pruner" Mar 19 18:58:37 crc kubenswrapper[5033]: E0319 18:58:37.068620 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f123389a-beda-4156-bd02-56ccf2a479f1" containerName="route-controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068627 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f123389a-beda-4156-bd02-56ccf2a479f1" containerName="route-controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: E0319 18:58:37.068634 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" containerName="controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068640 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" 
containerName="controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068726 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" containerName="controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068739 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2c0014-3594-464c-a0e8-8e4e36301dc1" containerName="pruner" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.068752 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f123389a-beda-4156-bd02-56ccf2a479f1" containerName="route-controller-manager" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.069388 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.079056 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.096400 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access\") pod \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.096535 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir\") pod \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\" (UID: \"6b2c0014-3594-464c-a0e8-8e4e36301dc1\") " Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.096783 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"6b2c0014-3594-464c-a0e8-8e4e36301dc1" (UID: "6b2c0014-3594-464c-a0e8-8e4e36301dc1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.104394 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b2c0014-3594-464c-a0e8-8e4e36301dc1" (UID: "6b2c0014-3594-464c-a0e8-8e4e36301dc1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.131792 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.169331 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.197402 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq48d\" (UniqueName: \"kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.197720 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.197849 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.197927 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.197944 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b2c0014-3594-464c-a0e8-8e4e36301dc1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:37 crc kubenswrapper[5033]: W0319 18:58:37.216896 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de9caaa_c912_48f2_9306_5cc7768fc8b3.slice/crio-edb4157b1d5b61ab0994b9a1a7b49b4942b9ec1d4f3ae87fe0a5a764ff12bf61 WatchSource:0}: Error finding container edb4157b1d5b61ab0994b9a1a7b49b4942b9ec1d4f3ae87fe0a5a764ff12bf61: Status 404 returned error can't find the container with id edb4157b1d5b61ab0994b9a1a7b49b4942b9ec1d4f3ae87fe0a5a764ff12bf61 Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.299066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.299409 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq48d\" (UniqueName: \"kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d\") pod \"redhat-marketplace-978qb\" 
(UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.299515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.299762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.300314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.301518 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.313283 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:37 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:37 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:37 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 
18:58:37.313320 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.321005 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq48d\" (UniqueName: \"kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d\") pod \"redhat-marketplace-978qb\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: W0319 18:58:37.332087 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70dbfc25_0b94_4c8f_89bb_af0d0bb8e2e6.slice/crio-0a19cef964dbd07c8a4efe5430e10a09df493856959374e7b270972475ef49c3 WatchSource:0}: Error finding container 0a19cef964dbd07c8a4efe5430e10a09df493856959374e7b270972475ef49c3: Status 404 returned error can't find the container with id 0a19cef964dbd07c8a4efe5430e10a09df493856959374e7b270972475ef49c3 Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.384224 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.395939 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.396587 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.412099 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.506248 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.506420 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.506530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.506651 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpg64\" (UniqueName: \"kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " 
pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.506739 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.608271 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.608723 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.608760 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.608785 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.608828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpg64\" (UniqueName: \"kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.613526 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.614111 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.619029 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.619528 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.630022 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpg64\" (UniqueName: \"kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64\") pod \"controller-manager-777c94ccf6-rjqvg\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.630477 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.630486 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fz4jq" event={"ID":"dabe1e76-f401-4d8c-99a4-36f7acc7e241","Type":"ContainerDied","Data":"790a3787fd177d9d40201cc965c57b45c8711dad22095e2add46de6c19a6dfab"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.630546 5033 scope.go:117] "RemoveContainer" containerID="9b7f72059c52fd7f7137968fb0f6dc6285a93d73ec410732ac3fe104eb540603" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.634865 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b2c0014-3594-464c-a0e8-8e4e36301dc1","Type":"ContainerDied","Data":"b553c07e4c33c11380185381b03bc048307516dcfae9f29c0b41dceff58a93a6"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.634898 5033 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b553c07e4c33c11380185381b03bc048307516dcfae9f29c0b41dceff58a93a6" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.634961 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.641001 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" event={"ID":"15a23e16-4194-4773-a4e8-1c3515d31c5c","Type":"ContainerStarted","Data":"fc54961017cddf4691cf1507ce52aadce474f3944a261cf54f035763c4bd9c6e"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.645259 5033 generic.go:334] "Generic (PLEG): container finished" podID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerID="64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271" exitCode=0 Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.645357 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerDied","Data":"64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.645394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerStarted","Data":"0a19cef964dbd07c8a4efe5430e10a09df493856959374e7b270972475ef49c3"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.655413 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.655405 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c" event={"ID":"f123389a-beda-4156-bd02-56ccf2a479f1","Type":"ContainerDied","Data":"343181e6aa68c0a3865417bd7540a90c24b2c1293f767b5309e4c88f1c820598"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.668877 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-g9hmn" podStartSLOduration=16.668858378 podStartE2EDuration="16.668858378s" podCreationTimestamp="2026-03-19 18:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:37.665892837 +0000 UTC m=+127.770922676" watchObservedRunningTime="2026-03-19 18:58:37.668858378 +0000 UTC m=+127.773888227" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.671412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" event={"ID":"1de9caaa-c912-48f2-9306-5cc7768fc8b3","Type":"ContainerStarted","Data":"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.671473 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" event={"ID":"1de9caaa-c912-48f2-9306-5cc7768fc8b3","Type":"ContainerStarted","Data":"edb4157b1d5b61ab0994b9a1a7b49b4942b9ec1d4f3ae87fe0a5a764ff12bf61"} Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.671933 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.672650 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.673637 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.678092 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.695015 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.710309 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.710444 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqx4m\" (UniqueName: \"kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.710963 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.715953 5033 scope.go:117] "RemoveContainer" 
containerID="0fe1b544c17f74855bb0c576294157fc23c2333aa9087cc054bce5f18662842e" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.718872 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.732580 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" podStartSLOduration=63.732545871 podStartE2EDuration="1m3.732545871s" podCreationTimestamp="2026-03-19 18:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:37.710249964 +0000 UTC m=+127.815279823" watchObservedRunningTime="2026-03-19 18:58:37.732545871 +0000 UTC m=+127.837575720" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.747859 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.778843 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fz4jq"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.786096 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.793015 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h772c"] Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.813679 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqx4m\" (UniqueName: \"kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m\") pod \"redhat-operators-ddsbp\" (UID: 
\"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.814016 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.814148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.814645 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.814952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.834050 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqx4m\" (UniqueName: \"kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m\") pod \"redhat-operators-ddsbp\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " 
pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:37 crc kubenswrapper[5033]: I0319 18:58:37.926996 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:58:37 crc kubenswrapper[5033]: W0319 18:58:37.936810 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38e7c13_214e_4636_83f2_bf0025afbec3.slice/crio-a46a70e04562e17b4925586f57e4720cc854af426dbcbd887701b7e023ec2ac7 WatchSource:0}: Error finding container a46a70e04562e17b4925586f57e4720cc854af426dbcbd887701b7e023ec2ac7: Status 404 returned error can't find the container with id a46a70e04562e17b4925586f57e4720cc854af426dbcbd887701b7e023ec2ac7 Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.016446 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.076727 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.077694 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.084865 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.117111 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znjv\" (UniqueName: \"kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.117149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.117203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.159948 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.219278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znjv\" (UniqueName: \"kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv\") pod \"redhat-operators-n4zm4\" (UID: 
\"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.219345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.219430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.220713 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.221074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.244004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znjv\" (UniqueName: \"kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv\") pod \"redhat-operators-n4zm4\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " 
pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.313076 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:38 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:38 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:38 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.313874 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.339682 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.346462 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:38 crc kubenswrapper[5033]: W0319 18:58:38.354198 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e3da86_ceaf_47ef_81af_07853efd035b.slice/crio-77a58f0bf88a6a8bb03229679d5a3f475d35eea59dac432fd9ec639f6fa9ec3c WatchSource:0}: Error finding container 77a58f0bf88a6a8bb03229679d5a3f475d35eea59dac432fd9ec639f6fa9ec3c: Status 404 returned error can't find the container with id 77a58f0bf88a6a8bb03229679d5a3f475d35eea59dac432fd9ec639f6fa9ec3c Mar 19 18:58:38 crc kubenswrapper[5033]: W0319 18:58:38.357928 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b43f7c8_f653_4846_a873_be457dd55f8d.slice/crio-5f841f5c80a63f88b919a5ad339ef59742d89fe6c8e13e155a817901ffd3dc57 WatchSource:0}: Error finding container 5f841f5c80a63f88b919a5ad339ef59742d89fe6c8e13e155a817901ffd3dc57: Status 404 returned error can't find the container with id 5f841f5c80a63f88b919a5ad339ef59742d89fe6c8e13e155a817901ffd3dc57 Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.396881 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.397983 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401434 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401673 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401687 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401730 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401458 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.401986 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:38 crc 
kubenswrapper[5033]: I0319 18:58:38.408423 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.421994 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.426752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.426817 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.426856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74kj\" (UniqueName: \"kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.426881 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.528245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.528505 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.529376 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.529819 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 
18:58:38.529883 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74kj\" (UniqueName: \"kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.530405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.537955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.546370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74kj\" (UniqueName: \"kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj\") pod \"route-controller-manager-644cf44bdd-hrwnj\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.596420 5033 ???:1] "http: TLS handshake error from 192.168.126.11:50548: no serving certificate available for the kubelet" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.631477 5033 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.632691 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabe1e76-f401-4d8c-99a4-36f7acc7e241" path="/var/lib/kubelet/pods/dabe1e76-f401-4d8c-99a4-36f7acc7e241/volumes" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.633394 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f123389a-beda-4156-bd02-56ccf2a479f1" path="/var/lib/kubelet/pods/f123389a-beda-4156-bd02-56ccf2a479f1/volumes" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.679957 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerStarted","Data":"77a58f0bf88a6a8bb03229679d5a3f475d35eea59dac432fd9ec639f6fa9ec3c"} Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.682798 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" event={"ID":"1b43f7c8-f653-4846-a873-be457dd55f8d","Type":"ContainerStarted","Data":"5f841f5c80a63f88b919a5ad339ef59742d89fe6c8e13e155a817901ffd3dc57"} Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.687190 5033 generic.go:334] "Generic (PLEG): container finished" podID="73127891-1d5d-4371-87c0-82245ab12d5d" containerID="d3c47cf3115467b6551c8ae8875310d6b60d9e4a0b4458cbfca0f09e4cadc039" exitCode=0 Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.687246 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" event={"ID":"73127891-1d5d-4371-87c0-82245ab12d5d","Type":"ContainerDied","Data":"d3c47cf3115467b6551c8ae8875310d6b60d9e4a0b4458cbfca0f09e4cadc039"} Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.697030 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerID="a37fad93b3e3d35e755f68ca6cbcc50647025f6e1a9eea7d297f74a3fee8df1e" exitCode=0 Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.697092 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerDied","Data":"a37fad93b3e3d35e755f68ca6cbcc50647025f6e1a9eea7d297f74a3fee8df1e"} Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.697140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerStarted","Data":"a46a70e04562e17b4925586f57e4720cc854af426dbcbd887701b7e023ec2ac7"} Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.738845 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:38 crc kubenswrapper[5033]: I0319 18:58:38.929569 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:58:39 crc kubenswrapper[5033]: W0319 18:58:39.001692 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cedfa0d_8527_4d20_9326_88bf40011456.slice/crio-1a77c8ece250db43ba830fa8ead1a2ab8e6ede54c97c58eb73cdb4f33364d59c WatchSource:0}: Error finding container 1a77c8ece250db43ba830fa8ead1a2ab8e6ede54c97c58eb73cdb4f33364d59c: Status 404 returned error can't find the container with id 1a77c8ece250db43ba830fa8ead1a2ab8e6ede54c97c58eb73cdb4f33364d59c Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.022742 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.031966 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2rhhh" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.128129 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cz22k" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.226777 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.227691 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.238494 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.244125 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.244321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.265412 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.265708 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" 
(UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.321510 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:39 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:39 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:39 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.321556 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.335760 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.367131 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.367179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 
18:58:39.367326 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.394038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: W0319 18:58:39.400772 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22e8fea_1efc_4ccc_9be6_de3e177717a8.slice/crio-02c1c0e4a6df086b18bd22d903259ee43f70b3d7fc0bd69ec9caa13db7833d1a WatchSource:0}: Error finding container 02c1c0e4a6df086b18bd22d903259ee43f70b3d7fc0bd69ec9caa13db7833d1a: Status 404 returned error can't find the container with id 02c1c0e4a6df086b18bd22d903259ee43f70b3d7fc0bd69ec9caa13db7833d1a Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.567039 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.710792 5033 generic.go:334] "Generic (PLEG): container finished" podID="63e3da86-ceaf-47ef-81af-07853efd035b" containerID="063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2" exitCode=0 Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.711146 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerDied","Data":"063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.721871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" event={"ID":"1b43f7c8-f653-4846-a873-be457dd55f8d","Type":"ContainerStarted","Data":"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.722233 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" podUID="1b43f7c8-f653-4846-a873-be457dd55f8d" containerName="controller-manager" containerID="cri-o://64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f" gracePeriod=30 Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.722925 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.736312 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.747298 5033 generic.go:334] "Generic (PLEG): container finished" podID="0cedfa0d-8527-4d20-9326-88bf40011456" 
containerID="1c2a2a15063ab56e9ecaffba6e97c49043b42564f4681d0477cc98b52b4be8ab" exitCode=0 Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.747403 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerDied","Data":"1c2a2a15063ab56e9ecaffba6e97c49043b42564f4681d0477cc98b52b4be8ab"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.747445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerStarted","Data":"1a77c8ece250db43ba830fa8ead1a2ab8e6ede54c97c58eb73cdb4f33364d59c"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.759005 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" event={"ID":"a22e8fea-1efc-4ccc-9be6-de3e177717a8","Type":"ContainerStarted","Data":"80ee30a854d53fef7ddc655e9d1a24470ffcd9ee13a788dd2b4d7b9bcf603b4a"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.759067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" event={"ID":"a22e8fea-1efc-4ccc-9be6-de3e177717a8","Type":"ContainerStarted","Data":"02c1c0e4a6df086b18bd22d903259ee43f70b3d7fc0bd69ec9caa13db7833d1a"} Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.759090 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.802512 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" podStartSLOduration=3.802472626 podStartE2EDuration="3.802472626s" podCreationTimestamp="2026-03-19 18:58:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:39.765875543 +0000 UTC m=+129.870905392" watchObservedRunningTime="2026-03-19 18:58:39.802472626 +0000 UTC m=+129.907502505" Mar 19 18:58:39 crc kubenswrapper[5033]: I0319 18:58:39.869106 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" podStartSLOduration=3.869085398 podStartE2EDuration="3.869085398s" podCreationTimestamp="2026-03-19 18:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:39.845847869 +0000 UTC m=+129.950877748" watchObservedRunningTime="2026-03-19 18:58:39.869085398 +0000 UTC m=+129.974115247" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.030496 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-x9vcw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.030512 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-x9vcw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.030554 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x9vcw" podUID="ac25d35e-036b-4d7f-a9e8-2417681f2a1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.030570 5033 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x9vcw" podUID="ac25d35e-036b-4d7f-a9e8-2417681f2a1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 19 18:58:40 crc kubenswrapper[5033]: E0319 18:58:40.059254 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:40 crc kubenswrapper[5033]: E0319 18:58:40.075007 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:40 crc kubenswrapper[5033]: E0319 18:58:40.079627 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:40 crc kubenswrapper[5033]: E0319 18:58:40.079756 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.082442 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.085159 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.085201 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.087722 5033 patch_prober.go:28] interesting pod/console-f9d7485db-5jw55 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.087762 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5jw55" podUID="9837c3e1-e614-408e-8914-c1390367407f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.117637 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-dq52v" Mar 19 18:58:40 crc kubenswrapper[5033]: W0319 18:58:40.118268 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddcf3ecaf_4f0e_4f5b_b1eb_18a120610c9f.slice/crio-891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26 WatchSource:0}: Error finding container 891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26: Status 404 returned error can't find the container with id 891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26 Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.150929 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.309207 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.312305 5033 patch_prober.go:28] interesting pod/router-default-5444994796-hb9f7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:40 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Mar 19 18:58:40 crc kubenswrapper[5033]: [+]process-running ok Mar 19 18:58:40 crc kubenswrapper[5033]: healthz check failed Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.312382 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hb9f7" podUID="4ee36317-5c1d-4ef5-a81f-c7d96e18e506" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.318776 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.324085 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.499902 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fct9g\" (UniqueName: \"kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g\") pod \"73127891-1d5d-4371-87c0-82245ab12d5d\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.499986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpg64\" (UniqueName: \"kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64\") pod \"1b43f7c8-f653-4846-a873-be457dd55f8d\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume\") pod \"73127891-1d5d-4371-87c0-82245ab12d5d\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert\") pod \"1b43f7c8-f653-4846-a873-be457dd55f8d\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500075 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca\") pod \"1b43f7c8-f653-4846-a873-be457dd55f8d\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500104 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume\") pod \"73127891-1d5d-4371-87c0-82245ab12d5d\" (UID: \"73127891-1d5d-4371-87c0-82245ab12d5d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500130 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config\") pod \"1b43f7c8-f653-4846-a873-be457dd55f8d\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500146 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles\") pod \"1b43f7c8-f653-4846-a873-be457dd55f8d\" (UID: \"1b43f7c8-f653-4846-a873-be457dd55f8d\") " Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.500999 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume" (OuterVolumeSpecName: "config-volume") pod "73127891-1d5d-4371-87c0-82245ab12d5d" (UID: "73127891-1d5d-4371-87c0-82245ab12d5d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.501264 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b43f7c8-f653-4846-a873-be457dd55f8d" (UID: "1b43f7c8-f653-4846-a873-be457dd55f8d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.501774 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config" (OuterVolumeSpecName: "config") pod "1b43f7c8-f653-4846-a873-be457dd55f8d" (UID: "1b43f7c8-f653-4846-a873-be457dd55f8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.502151 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1b43f7c8-f653-4846-a873-be457dd55f8d" (UID: "1b43f7c8-f653-4846-a873-be457dd55f8d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.507630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b43f7c8-f653-4846-a873-be457dd55f8d" (UID: "1b43f7c8-f653-4846-a873-be457dd55f8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.507747 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64" (OuterVolumeSpecName: "kube-api-access-dpg64") pod "1b43f7c8-f653-4846-a873-be457dd55f8d" (UID: "1b43f7c8-f653-4846-a873-be457dd55f8d"). InnerVolumeSpecName "kube-api-access-dpg64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.508204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73127891-1d5d-4371-87c0-82245ab12d5d" (UID: "73127891-1d5d-4371-87c0-82245ab12d5d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.508748 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g" (OuterVolumeSpecName: "kube-api-access-fct9g") pod "73127891-1d5d-4371-87c0-82245ab12d5d" (UID: "73127891-1d5d-4371-87c0-82245ab12d5d"). InnerVolumeSpecName "kube-api-access-fct9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.601987 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fct9g\" (UniqueName: \"kubernetes.io/projected/73127891-1d5d-4371-87c0-82245ab12d5d-kube-api-access-fct9g\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602019 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpg64\" (UniqueName: \"kubernetes.io/projected/1b43f7c8-f653-4846-a873-be457dd55f8d-kube-api-access-dpg64\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602029 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73127891-1d5d-4371-87c0-82245ab12d5d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602038 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b43f7c8-f653-4846-a873-be457dd55f8d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602047 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602055 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73127891-1d5d-4371-87c0-82245ab12d5d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602063 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.602071 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b43f7c8-f653-4846-a873-be457dd55f8d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.837151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f","Type":"ContainerStarted","Data":"891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26"} Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.848205 5033 generic.go:334] "Generic (PLEG): container finished" podID="1b43f7c8-f653-4846-a873-be457dd55f8d" containerID="64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f" exitCode=0 Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.848262 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" 
event={"ID":"1b43f7c8-f653-4846-a873-be457dd55f8d","Type":"ContainerDied","Data":"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f"} Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.848288 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" event={"ID":"1b43f7c8-f653-4846-a873-be457dd55f8d","Type":"ContainerDied","Data":"5f841f5c80a63f88b919a5ad339ef59742d89fe6c8e13e155a817901ffd3dc57"} Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.848303 5033 scope.go:117] "RemoveContainer" containerID="64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.848413 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777c94ccf6-rjqvg" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.855520 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" event={"ID":"73127891-1d5d-4371-87c0-82245ab12d5d","Type":"ContainerDied","Data":"fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f"} Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.855552 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.855573 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa7ff3f25e25b4104c87d4e8d4270f0fba847dddca6275c9a3fe3b3ff0dc736f" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.866510 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.871368 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-777c94ccf6-rjqvg"] Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.873993 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.896138 5033 scope.go:117] "RemoveContainer" containerID="64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f" Mar 19 18:58:40 crc kubenswrapper[5033]: E0319 18:58:40.899232 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f\": container with ID starting with 64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f not found: ID does not exist" containerID="64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f" Mar 19 18:58:40 crc kubenswrapper[5033]: I0319 18:58:40.899278 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f"} err="failed to get container status \"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f\": rpc error: code = NotFound desc = could not find container \"64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f\": 
container with ID starting with 64ee1f0400221bce478bf6d1ebab0a366d24006390a9c4f3a6dce15aec634b8f not found: ID does not exist" Mar 19 18:58:41 crc kubenswrapper[5033]: I0319 18:58:41.313481 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:41 crc kubenswrapper[5033]: I0319 18:58:41.317837 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hb9f7" Mar 19 18:58:41 crc kubenswrapper[5033]: I0319 18:58:41.879441 5033 generic.go:334] "Generic (PLEG): container finished" podID="dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" containerID="402b46e4b7b70b4f71f94556313df3c410a2b800d880310e9283ffe251f5334d" exitCode=0 Mar 19 18:58:41 crc kubenswrapper[5033]: I0319 18:58:41.879549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f","Type":"ContainerDied","Data":"402b46e4b7b70b4f71f94556313df3c410a2b800d880310e9283ffe251f5334d"} Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.408805 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:58:42 crc kubenswrapper[5033]: E0319 18:58:42.411346 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73127891-1d5d-4371-87c0-82245ab12d5d" containerName="collect-profiles" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.411540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73127891-1d5d-4371-87c0-82245ab12d5d" containerName="collect-profiles" Mar 19 18:58:42 crc kubenswrapper[5033]: E0319 18:58:42.411635 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b43f7c8-f653-4846-a873-be457dd55f8d" containerName="controller-manager" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.411857 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1b43f7c8-f653-4846-a873-be457dd55f8d" containerName="controller-manager" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.412185 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b43f7c8-f653-4846-a873-be457dd55f8d" containerName="controller-manager" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.413142 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73127891-1d5d-4371-87c0-82245ab12d5d" containerName="collect-profiles" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.413907 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.424169 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.425439 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.425649 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.425870 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.425977 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.426615 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.427934 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.431961 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.540214 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.540265 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.540320 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.540359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 
19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.540388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86zn\" (UniqueName: \"kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.633608 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b43f7c8-f653-4846-a873-be457dd55f8d" path="/var/lib/kubelet/pods/1b43f7c8-f653-4846-a873-be457dd55f8d/volumes" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.641521 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.641576 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86zn\" (UniqueName: \"kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.641600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc 
kubenswrapper[5033]: I0319 18:58:42.641622 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.641662 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.642653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.643173 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.644160 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " 
pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.647287 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.664288 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86zn\" (UniqueName: \"kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn\") pod \"controller-manager-79d6bccb64-jtx4r\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:42 crc kubenswrapper[5033]: I0319 18:58:42.791069 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.231004 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:58:43 crc kubenswrapper[5033]: W0319 18:58:43.240443 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61959148_db52_4466_b6ee_497bd961721c.slice/crio-1f71d294621313af101d6d675f861e7811887ac53f77386472de7185fe0b6902 WatchSource:0}: Error finding container 1f71d294621313af101d6d675f861e7811887ac53f77386472de7185fe0b6902: Status 404 returned error can't find the container with id 1f71d294621313af101d6d675f861e7811887ac53f77386472de7185fe0b6902 Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.247494 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.256328 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir\") pod \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.256811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" (UID: "dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.257073 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access\") pod \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\" (UID: \"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f\") " Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.257918 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.269388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" (UID: "dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.358725 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.744924 5033 ???:1] "http: TLS handshake error from 192.168.126.11:50554: no serving certificate available for the kubelet" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.974582 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f","Type":"ContainerDied","Data":"891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26"} Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.974620 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="891a8f914d18b4b2cc0a9c8f604489165452fcc44ec8bbc3f4b83389bfbf8d26" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.974673 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.998259 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" event={"ID":"61959148-db52-4466-b6ee-497bd961721c","Type":"ContainerStarted","Data":"c7330de4f7a33def511ecabd40bdcb3bef862766e7e56558c2af2921c4558894"} Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.998327 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" event={"ID":"61959148-db52-4466-b6ee-497bd961721c","Type":"ContainerStarted","Data":"1f71d294621313af101d6d675f861e7811887ac53f77386472de7185fe0b6902"} Mar 19 18:58:43 crc kubenswrapper[5033]: I0319 18:58:43.999079 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:44 crc kubenswrapper[5033]: I0319 18:58:44.011155 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:58:44 crc kubenswrapper[5033]: I0319 18:58:44.031316 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" podStartSLOduration=6.031289386 podStartE2EDuration="6.031289386s" podCreationTimestamp="2026-03-19 18:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:44.026890336 +0000 UTC m=+134.131920185" watchObservedRunningTime="2026-03-19 18:58:44.031289386 +0000 UTC m=+134.136319245" Mar 19 18:58:45 crc kubenswrapper[5033]: I0319 18:58:45.048184 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fvwzx" Mar 19 18:58:45 crc kubenswrapper[5033]: I0319 
18:58:45.816344 5033 ???:1] "http: TLS handshake error from 192.168.126.11:50558: no serving certificate available for the kubelet" Mar 19 18:58:50 crc kubenswrapper[5033]: I0319 18:58:50.047474 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x9vcw" Mar 19 18:58:50 crc kubenswrapper[5033]: E0319 18:58:50.056079 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:50 crc kubenswrapper[5033]: E0319 18:58:50.058046 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:50 crc kubenswrapper[5033]: E0319 18:58:50.059264 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:50 crc kubenswrapper[5033]: E0319 18:58:50.059315 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:50 crc kubenswrapper[5033]: I0319 18:58:50.104898 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:50 crc kubenswrapper[5033]: I0319 18:58:50.109875 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 18:58:54 crc kubenswrapper[5033]: I0319 18:58:54.029140 5033 ???:1] "http: TLS handshake error from 192.168.126.11:38744: no serving certificate available for the kubelet" Mar 19 18:58:55 crc kubenswrapper[5033]: I0319 18:58:55.937294 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:58:55 crc kubenswrapper[5033]: I0319 18:58:55.937620 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" containerID="cri-o://c7330de4f7a33def511ecabd40bdcb3bef862766e7e56558c2af2921c4558894" gracePeriod=30 Mar 19 18:58:56 crc kubenswrapper[5033]: I0319 18:58:56.018533 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:58:56 crc kubenswrapper[5033]: I0319 18:58:56.018962 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" containerID="cri-o://80ee30a854d53fef7ddc655e9d1a24470ffcd9ee13a788dd2b4d7b9bcf603b4a" gracePeriod=30 Mar 19 18:58:56 crc kubenswrapper[5033]: I0319 18:58:56.891604 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 18:58:58 crc kubenswrapper[5033]: I0319 18:58:58.098601 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="61959148-db52-4466-b6ee-497bd961721c" containerID="c7330de4f7a33def511ecabd40bdcb3bef862766e7e56558c2af2921c4558894" exitCode=0 Mar 19 18:58:58 crc kubenswrapper[5033]: I0319 18:58:58.098892 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" event={"ID":"61959148-db52-4466-b6ee-497bd961721c","Type":"ContainerDied","Data":"c7330de4f7a33def511ecabd40bdcb3bef862766e7e56558c2af2921c4558894"} Mar 19 18:58:58 crc kubenswrapper[5033]: I0319 18:58:58.739396 5033 patch_prober.go:28] interesting pod/route-controller-manager-644cf44bdd-hrwnj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 19 18:58:58 crc kubenswrapper[5033]: I0319 18:58:58.739810 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 19 18:58:59 crc kubenswrapper[5033]: I0319 18:58:59.107717 5033 generic.go:334] "Generic (PLEG): container finished" podID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerID="80ee30a854d53fef7ddc655e9d1a24470ffcd9ee13a788dd2b4d7b9bcf603b4a" exitCode=0 Mar 19 18:58:59 crc kubenswrapper[5033]: I0319 18:58:59.107764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" event={"ID":"a22e8fea-1efc-4ccc-9be6-de3e177717a8","Type":"ContainerDied","Data":"80ee30a854d53fef7ddc655e9d1a24470ffcd9ee13a788dd2b4d7b9bcf603b4a"} Mar 19 18:59:00 crc kubenswrapper[5033]: E0319 18:59:00.056104 5033 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:00 crc kubenswrapper[5033]: E0319 18:59:00.057690 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:00 crc kubenswrapper[5033]: E0319 18:59:00.059199 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:00 crc kubenswrapper[5033]: E0319 18:59:00.059232 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:00 crc kubenswrapper[5033]: I0319 18:59:00.637102 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 18:59:02 crc kubenswrapper[5033]: I0319 18:59:02.792084 5033 patch_prober.go:28] interesting pod/controller-manager-79d6bccb64-jtx4r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 19 
18:59:02 crc kubenswrapper[5033]: I0319 18:59:02.792438 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 19 18:59:04 crc kubenswrapper[5033]: E0319 18:59:04.570430 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 18:59:04 crc kubenswrapper[5033]: E0319 18:59:04.571166 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:59:04 crc kubenswrapper[5033]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 18:59:04 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pxr5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565778-gls8c_openshift-infra(05cd9325-9740-4a70-98a7-3de9ebb30035): ErrImagePull: rpc error: code = Canceled 
desc = copying system image from manifest list: copying config: context canceled Mar 19 18:59:04 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 18:59:04 crc kubenswrapper[5033]: E0319 18:59:04.572497 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565778-gls8c" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" Mar 19 18:59:05 crc kubenswrapper[5033]: I0319 18:59:05.138810 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-ntlnh_0fb8fbf8-b29b-4b07-a598-915a2c65affa/kube-multus-additional-cni-plugins/0.log" Mar 19 18:59:05 crc kubenswrapper[5033]: I0319 18:59:05.139309 5033 generic.go:334] "Generic (PLEG): container finished" podID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" exitCode=137 Mar 19 18:59:05 crc kubenswrapper[5033]: I0319 18:59:05.139405 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" event={"ID":"0fb8fbf8-b29b-4b07-a598-915a2c65affa","Type":"ContainerDied","Data":"9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf"} Mar 19 18:59:05 crc kubenswrapper[5033]: E0319 18:59:05.141786 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565778-gls8c" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" Mar 19 18:59:05 crc kubenswrapper[5033]: I0319 18:59:05.167791 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.167773943 
podStartE2EDuration="5.167773943s" podCreationTimestamp="2026-03-19 18:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:05.163288779 +0000 UTC m=+155.268318658" watchObservedRunningTime="2026-03-19 18:59:05.167773943 +0000 UTC m=+155.272803792" Mar 19 18:59:06 crc kubenswrapper[5033]: E0319 18:59:06.220002 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 18:59:06 crc kubenswrapper[5033]: E0319 18:59:06.220590 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnfqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hhm4g_openshift-marketplace(ab53fd0b-8294-4dd0-a434-ab7eaff0e360): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:59:06 crc kubenswrapper[5033]: E0319 18:59:06.221927 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hhm4g" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" Mar 19 18:59:06 crc 
kubenswrapper[5033]: I0319 18:59:06.555320 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:59:06 crc kubenswrapper[5033]: E0319 18:59:06.555550 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" containerName="pruner" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.555564 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" containerName="pruner" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.555710 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf3ecaf-4f0e-4f5b-b1eb-18a120610c9f" containerName="pruner" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.556039 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.557961 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.558764 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.562638 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.615098 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.615151 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.716187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.716235 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.716305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.748996 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:06 crc kubenswrapper[5033]: I0319 18:59:06.928864 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:07 crc kubenswrapper[5033]: E0319 18:59:07.612635 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hhm4g" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" Mar 19 18:59:08 crc kubenswrapper[5033]: I0319 18:59:08.631842 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 18:59:09 crc kubenswrapper[5033]: I0319 18:59:09.740260 5033 patch_prober.go:28] interesting pod/route-controller-manager-644cf44bdd-hrwnj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:59:09 crc kubenswrapper[5033]: I0319 18:59:09.740763 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:59:09 crc kubenswrapper[5033]: I0319 18:59:09.879279 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:59:09 crc kubenswrapper[5033]: I0319 18:59:09.889056 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ds9rw" Mar 19 18:59:09 crc 
kubenswrapper[5033]: I0319 18:59:09.895958 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.895943054 podStartE2EDuration="1.895943054s" podCreationTimestamp="2026-03-19 18:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:09.894407335 +0000 UTC m=+159.999437194" watchObservedRunningTime="2026-03-19 18:59:09.895943054 +0000 UTC m=+160.000972903" Mar 19 18:59:10 crc kubenswrapper[5033]: E0319 18:59:10.054853 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf is running failed: container process not found" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:10 crc kubenswrapper[5033]: E0319 18:59:10.055224 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf is running failed: container process not found" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:10 crc kubenswrapper[5033]: E0319 18:59:10.055493 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf is running failed: container process not found" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:10 crc kubenswrapper[5033]: E0319 18:59:10.055524 
5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.570054 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.572391 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.573219 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.687588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.687652 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.687908 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir\") pod 
\"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.789358 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.789473 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.789494 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.789534 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.789572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: 
I0319 18:59:11.809870 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access\") pod \"installer-9-crc\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:11 crc kubenswrapper[5033]: I0319 18:59:11.903686 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.841513 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.841681 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6znjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n4zm4_openshift-marketplace(0cedfa0d-8527-4d20-9326-88bf40011456): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.843430 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n4zm4" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" Mar 19 18:59:12 crc 
kubenswrapper[5033]: I0319 18:59:12.902892 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.920402 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.922517 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.922645 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqx4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ddsbp_openshift-marketplace(63e3da86-ceaf-47ef-81af-07853efd035b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.923807 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ddsbp" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" Mar 19 18:59:12 crc 
kubenswrapper[5033]: I0319 18:59:12.925374 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-ntlnh_0fb8fbf8-b29b-4b07-a598-915a2c65affa/kube-multus-additional-cni-plugins/0.log" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.925536 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.965648 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.965865 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.965880 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.965894 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.965901 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: E0319 18:59:12.965911 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.965917 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.966009 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" containerName="route-controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.966021 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.966034 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.966444 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:12 crc kubenswrapper[5033]: I0319 18:59:12.971813 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.007920 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready\") pod \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.007980 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca\") pod \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008015 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config\") pod \"61959148-db52-4466-b6ee-497bd961721c\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008065 
5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert\") pod \"61959148-db52-4466-b6ee-497bd961721c\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir\") pod \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008114 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbq46\" (UniqueName: \"kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46\") pod \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008140 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles\") pod \"61959148-db52-4466-b6ee-497bd961721c\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008172 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist\") pod \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\" (UID: \"0fb8fbf8-b29b-4b07-a598-915a2c65affa\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008196 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca\") pod 
\"61959148-db52-4466-b6ee-497bd961721c\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008220 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config\") pod \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008240 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c86zn\" (UniqueName: \"kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn\") pod \"61959148-db52-4466-b6ee-497bd961721c\" (UID: \"61959148-db52-4466-b6ee-497bd961721c\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008266 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert\") pod \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008294 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g74kj\" (UniqueName: \"kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj\") pod \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\" (UID: \"a22e8fea-1efc-4ccc-9be6-de3e177717a8\") " Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008413 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl99j\" (UniqueName: \"kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc 
kubenswrapper[5033]: I0319 18:59:13.008440 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008496 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008529 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.008555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.010330 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"61959148-db52-4466-b6ee-497bd961721c" (UID: "61959148-db52-4466-b6ee-497bd961721c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.010534 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "0fb8fbf8-b29b-4b07-a598-915a2c65affa" (UID: "0fb8fbf8-b29b-4b07-a598-915a2c65affa"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.010482 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config" (OuterVolumeSpecName: "config") pod "61959148-db52-4466-b6ee-497bd961721c" (UID: "61959148-db52-4466-b6ee-497bd961721c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.010746 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "0fb8fbf8-b29b-4b07-a598-915a2c65affa" (UID: "0fb8fbf8-b29b-4b07-a598-915a2c65affa"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.011008 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready" (OuterVolumeSpecName: "ready") pod "0fb8fbf8-b29b-4b07-a598-915a2c65affa" (UID: "0fb8fbf8-b29b-4b07-a598-915a2c65affa"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.011517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca" (OuterVolumeSpecName: "client-ca") pod "61959148-db52-4466-b6ee-497bd961721c" (UID: "61959148-db52-4466-b6ee-497bd961721c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.011746 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "a22e8fea-1efc-4ccc-9be6-de3e177717a8" (UID: "a22e8fea-1efc-4ccc-9be6-de3e177717a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.011958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config" (OuterVolumeSpecName: "config") pod "a22e8fea-1efc-4ccc-9be6-de3e177717a8" (UID: "a22e8fea-1efc-4ccc-9be6-de3e177717a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.016275 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn" (OuterVolumeSpecName: "kube-api-access-c86zn") pod "61959148-db52-4466-b6ee-497bd961721c" (UID: "61959148-db52-4466-b6ee-497bd961721c"). InnerVolumeSpecName "kube-api-access-c86zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.020755 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46" (OuterVolumeSpecName: "kube-api-access-bbq46") pod "0fb8fbf8-b29b-4b07-a598-915a2c65affa" (UID: "0fb8fbf8-b29b-4b07-a598-915a2c65affa"). InnerVolumeSpecName "kube-api-access-bbq46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.020917 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a22e8fea-1efc-4ccc-9be6-de3e177717a8" (UID: "a22e8fea-1efc-4ccc-9be6-de3e177717a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.021791 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61959148-db52-4466-b6ee-497bd961721c" (UID: "61959148-db52-4466-b6ee-497bd961721c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.029431 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj" (OuterVolumeSpecName: "kube-api-access-g74kj") pod "a22e8fea-1efc-4ccc-9be6-de3e177717a8" (UID: "a22e8fea-1efc-4ccc-9be6-de3e177717a8"). InnerVolumeSpecName "kube-api-access-g74kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.114561 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.114799 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.114875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl99j\" (UniqueName: \"kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.114912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.114964 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config\") pod 
\"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115026 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g74kj\" (UniqueName: \"kubernetes.io/projected/a22e8fea-1efc-4ccc-9be6-de3e177717a8-kube-api-access-g74kj\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115043 5033 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/0fb8fbf8-b29b-4b07-a598-915a2c65affa-ready\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115052 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115063 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115074 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61959148-db52-4466-b6ee-497bd961721c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115087 5033 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fb8fbf8-b29b-4b07-a598-915a2c65affa-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115096 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbq46\" (UniqueName: \"kubernetes.io/projected/0fb8fbf8-b29b-4b07-a598-915a2c65affa-kube-api-access-bbq46\") on node \"crc\" 
DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115104 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115114 5033 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fb8fbf8-b29b-4b07-a598-915a2c65affa-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115125 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61959148-db52-4466-b6ee-497bd961721c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115134 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22e8fea-1efc-4ccc-9be6-de3e177717a8-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115144 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c86zn\" (UniqueName: \"kubernetes.io/projected/61959148-db52-4466-b6ee-497bd961721c-kube-api-access-c86zn\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.115155 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22e8fea-1efc-4ccc-9be6-de3e177717a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.118348 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " 
pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.120236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.122981 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.123591 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.137514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl99j\" (UniqueName: \"kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j\") pod \"controller-manager-796bc79b88-58snb\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.184984 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" 
event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerStarted","Data":"88581cb38f9b611578e93901df6bcc7b34d5764ca56acc9d9ebd9c99317b9080"} Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.186678 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerStarted","Data":"a1349a7c32b6fe2aa5c614b72f3bdb7a179e7f2b96361e7bb0a8bfcb6e63919f"} Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.193719 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-ntlnh_0fb8fbf8-b29b-4b07-a598-915a2c65affa/kube-multus-additional-cni-plugins/0.log" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.193823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" event={"ID":"0fb8fbf8-b29b-4b07-a598-915a2c65affa","Type":"ContainerDied","Data":"7827a311424867be68db5eed4736d9b251f516108bfa5d2cbe59a73184da0a66"} Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.193881 5033 scope.go:117] "RemoveContainer" containerID="9673e1498a1739b4ac84d0cddb28422342b6d1423af5001e4e13b75cb4d795cf" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.194032 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-ntlnh" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.217752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" event={"ID":"a22e8fea-1efc-4ccc-9be6-de3e177717a8","Type":"ContainerDied","Data":"02c1c0e4a6df086b18bd22d903259ee43f70b3d7fc0bd69ec9caa13db7833d1a"} Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.217887 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.233583 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-ntlnh"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.234353 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" event={"ID":"61959148-db52-4466-b6ee-497bd961721c","Type":"ContainerDied","Data":"1f71d294621313af101d6d675f861e7811887ac53f77386472de7185fe0b6902"} Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.234882 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.236713 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-ntlnh"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.246665 5033 scope.go:117] "RemoveContainer" containerID="80ee30a854d53fef7ddc655e9d1a24470ffcd9ee13a788dd2b4d7b9bcf603b4a" Mar 19 18:59:13 crc kubenswrapper[5033]: E0319 18:59:13.246782 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ddsbp" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" Mar 19 18:59:13 crc kubenswrapper[5033]: E0319 18:59:13.246800 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n4zm4" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" Mar 19 18:59:13 crc kubenswrapper[5033]: 
I0319 18:59:13.255846 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.286045 5033 scope.go:117] "RemoveContainer" containerID="c7330de4f7a33def511ecabd40bdcb3bef862766e7e56558c2af2921c4558894" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.294538 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.304514 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644cf44bdd-hrwnj"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.310275 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.312659 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79d6bccb64-jtx4r"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.342131 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.375421 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.588907 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.792113 5033 patch_prober.go:28] interesting pod/controller-manager-79d6bccb64-jtx4r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:59:13 crc kubenswrapper[5033]: I0319 18:59:13.792628 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79d6bccb64-jtx4r" podUID="61959148-db52-4466-b6ee-497bd961721c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.243672 5033 generic.go:334] "Generic (PLEG): container finished" podID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerID="fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd" exitCode=0 Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.243772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerDied","Data":"fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.248316 5033 generic.go:334] "Generic (PLEG): 
container finished" podID="86345a30-6cc0-4359-8051-e508c71833f4" containerID="88581cb38f9b611578e93901df6bcc7b34d5764ca56acc9d9ebd9c99317b9080" exitCode=0 Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.248417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerDied","Data":"88581cb38f9b611578e93901df6bcc7b34d5764ca56acc9d9ebd9c99317b9080"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.252042 5033 generic.go:334] "Generic (PLEG): container finished" podID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerID="7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244" exitCode=0 Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.252160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerDied","Data":"7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.257311 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" event={"ID":"9bb4fddb-4f2b-4dd9-977b-4ee590220f79","Type":"ContainerStarted","Data":"1fa7af0a539c4c5d86350f2f168522d69bff04645125d85f7e5ab9c65033935f"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.257363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" event={"ID":"9bb4fddb-4f2b-4dd9-977b-4ee590220f79","Type":"ContainerStarted","Data":"2088c2ff3fba918a9637e0f3daddbdbc6d1439b6746f391653b2513fcffdc30f"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.257715 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 
18:59:14.264159 5033 generic.go:334] "Generic (PLEG): container finished" podID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerID="3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d" exitCode=0 Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.264240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerDied","Data":"3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.266484 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.268573 5033 generic.go:334] "Generic (PLEG): container finished" podID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerID="a1349a7c32b6fe2aa5c614b72f3bdb7a179e7f2b96361e7bb0a8bfcb6e63919f" exitCode=0 Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.268618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerDied","Data":"a1349a7c32b6fe2aa5c614b72f3bdb7a179e7f2b96361e7bb0a8bfcb6e63919f"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.270777 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"67290f23-7880-41c5-a90f-575154eb45da","Type":"ContainerStarted","Data":"6ca7a60b981e237d69450d2a226f6a246f9bcb0ac59c935bec082153e810a336"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.270817 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"67290f23-7880-41c5-a90f-575154eb45da","Type":"ContainerStarted","Data":"46e782e467969bbfde52d3b5a989909f9beead346dc2bc4fc0eca426e4c5948f"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 
18:59:14.275628 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bcc92f72-8f71-4a97-9ab2-560a092a29a0","Type":"ContainerStarted","Data":"754f0d75443428f1a5982aa03dc34ff53242a7e3837e6066547a14ef1b22effd"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.275693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bcc92f72-8f71-4a97-9ab2-560a092a29a0","Type":"ContainerStarted","Data":"328edc4dd448bf24aa6baafb505c93af15cc22badfe250f9cd54364c5d8061e6"} Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.304566 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=8.304540297 podStartE2EDuration="8.304540297s" podCreationTimestamp="2026-03-19 18:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:14.298508953 +0000 UTC m=+164.403538802" watchObservedRunningTime="2026-03-19 18:59:14.304540297 +0000 UTC m=+164.409570146" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.382552 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" podStartSLOduration=19.382525803 podStartE2EDuration="19.382525803s" podCreationTimestamp="2026-03-19 18:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:14.380203423 +0000 UTC m=+164.485233292" watchObservedRunningTime="2026-03-19 18:59:14.382525803 +0000 UTC m=+164.487555652" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.467074 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.4670472119999998 
podStartE2EDuration="3.467047212s" podCreationTimestamp="2026-03-19 18:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:14.466764211 +0000 UTC m=+164.571794070" watchObservedRunningTime="2026-03-19 18:59:14.467047212 +0000 UTC m=+164.572077061" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.628903 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb8fbf8-b29b-4b07-a598-915a2c65affa" path="/var/lib/kubelet/pods/0fb8fbf8-b29b-4b07-a598-915a2c65affa/volumes" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.629785 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61959148-db52-4466-b6ee-497bd961721c" path="/var/lib/kubelet/pods/61959148-db52-4466-b6ee-497bd961721c/volumes" Mar 19 18:59:14 crc kubenswrapper[5033]: I0319 18:59:14.630400 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22e8fea-1efc-4ccc-9be6-de3e177717a8" path="/var/lib/kubelet/pods/a22e8fea-1efc-4ccc-9be6-de3e177717a8/volumes" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.282581 5033 generic.go:334] "Generic (PLEG): container finished" podID="bcc92f72-8f71-4a97-9ab2-560a092a29a0" containerID="754f0d75443428f1a5982aa03dc34ff53242a7e3837e6066547a14ef1b22effd" exitCode=0 Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.282656 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bcc92f72-8f71-4a97-9ab2-560a092a29a0","Type":"ContainerDied","Data":"754f0d75443428f1a5982aa03dc34ff53242a7e3837e6066547a14ef1b22effd"} Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.440968 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.442116 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.446833 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.446938 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxg5\" (UniqueName: \"kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.446995 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.447020 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.448356 5033 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.448719 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.449498 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.449753 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.451178 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.456436 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.456705 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.547950 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.547993 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config\") pod \"route-controller-manager-7b9ff66c66-6fn42\" 
(UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.548069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.548101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxg5\" (UniqueName: \"kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.550760 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.550809 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.557220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.566730 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxg5\" (UniqueName: \"kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5\") pod \"route-controller-manager-7b9ff66c66-6fn42\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.767633 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.928113 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:15 crc kubenswrapper[5033]: I0319 18:59:15.956249 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.565482 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.662178 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access\") pod \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.662372 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir\") pod \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\" (UID: \"bcc92f72-8f71-4a97-9ab2-560a092a29a0\") " Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.662486 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bcc92f72-8f71-4a97-9ab2-560a092a29a0" (UID: "bcc92f72-8f71-4a97-9ab2-560a092a29a0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.662937 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.670201 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bcc92f72-8f71-4a97-9ab2-560a092a29a0" (UID: "bcc92f72-8f71-4a97-9ab2-560a092a29a0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.763899 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bcc92f72-8f71-4a97-9ab2-560a092a29a0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:16 crc kubenswrapper[5033]: I0319 18:59:16.780525 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:16 crc kubenswrapper[5033]: W0319 18:59:16.785515 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32ec908a_df77_42a6_bd5f_f1c9ee56639d.slice/crio-60db44753d77a57fc220238ae70a69163bba7efd12079326614b6573c4049e26 WatchSource:0}: Error finding container 60db44753d77a57fc220238ae70a69163bba7efd12079326614b6573c4049e26: Status 404 returned error can't find the container with id 60db44753d77a57fc220238ae70a69163bba7efd12079326614b6573c4049e26 Mar 19 18:59:17 crc kubenswrapper[5033]: I0319 18:59:17.296515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bcc92f72-8f71-4a97-9ab2-560a092a29a0","Type":"ContainerDied","Data":"328edc4dd448bf24aa6baafb505c93af15cc22badfe250f9cd54364c5d8061e6"} Mar 19 18:59:17 crc kubenswrapper[5033]: I0319 18:59:17.296561 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328edc4dd448bf24aa6baafb505c93af15cc22badfe250f9cd54364c5d8061e6" Mar 19 18:59:17 crc kubenswrapper[5033]: I0319 18:59:17.297035 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:17 crc kubenswrapper[5033]: I0319 18:59:17.298228 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" event={"ID":"32ec908a-df77-42a6-bd5f-f1c9ee56639d","Type":"ContainerStarted","Data":"60db44753d77a57fc220238ae70a69163bba7efd12079326614b6573c4049e26"} Mar 19 18:59:17 crc kubenswrapper[5033]: I0319 18:59:17.298357 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" podUID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" containerName="controller-manager" containerID="cri-o://1fa7af0a539c4c5d86350f2f168522d69bff04645125d85f7e5ab9c65033935f" gracePeriod=30 Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.307393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerStarted","Data":"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1"} Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.309366 5033 generic.go:334] "Generic (PLEG): container finished" podID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" containerID="1fa7af0a539c4c5d86350f2f168522d69bff04645125d85f7e5ab9c65033935f" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.309394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" event={"ID":"9bb4fddb-4f2b-4dd9-977b-4ee590220f79","Type":"ContainerDied","Data":"1fa7af0a539c4c5d86350f2f168522d69bff04645125d85f7e5ab9c65033935f"} Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.329549 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2zhl" podStartSLOduration=4.580071754 podStartE2EDuration="44.329526218s" 
podCreationTimestamp="2026-03-19 18:58:34 +0000 UTC" firstStartedPulling="2026-03-19 18:58:36.620418734 +0000 UTC m=+126.725448573" lastFinishedPulling="2026-03-19 18:59:16.369873188 +0000 UTC m=+166.474903037" observedRunningTime="2026-03-19 18:59:18.32934527 +0000 UTC m=+168.434375139" watchObservedRunningTime="2026-03-19 18:59:18.329526218 +0000 UTC m=+168.434556067" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.460303 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.486343 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles\") pod \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.486426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl99j\" (UniqueName: \"kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j\") pod \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.486473 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert\") pod \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.488964 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9bb4fddb-4f2b-4dd9-977b-4ee590220f79" (UID: 
"9bb4fddb-4f2b-4dd9-977b-4ee590220f79"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.489095 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca\") pod \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.489195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config\") pod \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\" (UID: \"9bb4fddb-4f2b-4dd9-977b-4ee590220f79\") " Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.489856 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.490726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bb4fddb-4f2b-4dd9-977b-4ee590220f79" (UID: "9bb4fddb-4f2b-4dd9-977b-4ee590220f79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.491109 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config" (OuterVolumeSpecName: "config") pod "9bb4fddb-4f2b-4dd9-977b-4ee590220f79" (UID: "9bb4fddb-4f2b-4dd9-977b-4ee590220f79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.494854 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bb4fddb-4f2b-4dd9-977b-4ee590220f79" (UID: "9bb4fddb-4f2b-4dd9-977b-4ee590220f79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.495431 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j" (OuterVolumeSpecName: "kube-api-access-gl99j") pod "9bb4fddb-4f2b-4dd9-977b-4ee590220f79" (UID: "9bb4fddb-4f2b-4dd9-977b-4ee590220f79"). InnerVolumeSpecName "kube-api-access-gl99j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.591013 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl99j\" (UniqueName: \"kubernetes.io/projected/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-kube-api-access-gl99j\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.591039 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.591050 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[5033]: I0319 18:59:18.591061 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb4fddb-4f2b-4dd9-977b-4ee590220f79-config\") on node \"crc\" DevicePath 
\"\"" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.318664 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerStarted","Data":"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.320868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerStarted","Data":"77a8871a3ceb45b574459da2eadf532d247ec44713b7eb5a8425f70967fb90b6"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.322226 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.322272 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-796bc79b88-58snb" event={"ID":"9bb4fddb-4f2b-4dd9-977b-4ee590220f79","Type":"ContainerDied","Data":"2088c2ff3fba918a9637e0f3daddbdbc6d1439b6746f391653b2513fcffdc30f"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.322329 5033 scope.go:117] "RemoveContainer" containerID="1fa7af0a539c4c5d86350f2f168522d69bff04645125d85f7e5ab9c65033935f" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.327587 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" event={"ID":"32ec908a-df77-42a6-bd5f-f1c9ee56639d","Type":"ContainerStarted","Data":"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.327696 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 
18:59:19.327647 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" podUID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" containerName="route-controller-manager" containerID="cri-o://9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a" gracePeriod=30 Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.335956 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.338647 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerStarted","Data":"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.342016 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerStarted","Data":"d42adf2e17cc1eb2320d1e47cea941ce3b083b0e3c3a366e9744228956b129d3"} Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.349805 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dvj7" podStartSLOduration=2.926611685 podStartE2EDuration="43.349777881s" podCreationTimestamp="2026-03-19 18:58:36 +0000 UTC" firstStartedPulling="2026-03-19 18:58:37.648443895 +0000 UTC m=+127.753473744" lastFinishedPulling="2026-03-19 18:59:18.071610091 +0000 UTC m=+168.176639940" observedRunningTime="2026-03-19 18:59:19.345150672 +0000 UTC m=+169.450180521" watchObservedRunningTime="2026-03-19 18:59:19.349777881 +0000 UTC m=+169.454807730" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.371022 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" podStartSLOduration=23.371003625 podStartE2EDuration="23.371003625s" podCreationTimestamp="2026-03-19 18:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:19.367419176 +0000 UTC m=+169.472449045" watchObservedRunningTime="2026-03-19 18:59:19.371003625 +0000 UTC m=+169.476033474" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.416011 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhrx9" podStartSLOduration=3.457539663 podStartE2EDuration="45.41599204s" podCreationTimestamp="2026-03-19 18:58:34 +0000 UTC" firstStartedPulling="2026-03-19 18:58:36.485882945 +0000 UTC m=+126.590912794" lastFinishedPulling="2026-03-19 18:59:18.444335322 +0000 UTC m=+168.549365171" observedRunningTime="2026-03-19 18:59:19.415162018 +0000 UTC m=+169.520191887" watchObservedRunningTime="2026-03-19 18:59:19.41599204 +0000 UTC m=+169.521021889" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447021 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:19 crc kubenswrapper[5033]: E0319 18:59:19.447280 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc92f72-8f71-4a97-9ab2-560a092a29a0" containerName="pruner" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447300 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc92f72-8f71-4a97-9ab2-560a092a29a0" containerName="pruner" Mar 19 18:59:19 crc kubenswrapper[5033]: E0319 18:59:19.447323 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" containerName="controller-manager" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447333 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" containerName="controller-manager" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447498 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc92f72-8f71-4a97-9ab2-560a092a29a0" containerName="pruner" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447528 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" containerName="controller-manager" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.447995 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.450823 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.451063 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.451079 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.451151 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.451234 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.451276 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.459179 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 
18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.463222 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.466126 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-978qb" podStartSLOduration=2.632303628 podStartE2EDuration="42.466103034s" podCreationTimestamp="2026-03-19 18:58:37 +0000 UTC" firstStartedPulling="2026-03-19 18:58:38.70190893 +0000 UTC m=+128.806938780" lastFinishedPulling="2026-03-19 18:59:18.535708327 +0000 UTC m=+168.640738186" observedRunningTime="2026-03-19 18:59:19.457101095 +0000 UTC m=+169.562130964" watchObservedRunningTime="2026-03-19 18:59:19.466103034 +0000 UTC m=+169.571132883" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.479308 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.491368 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-796bc79b88-58snb"] Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.514129 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-np9gj" podStartSLOduration=2.441495221 podStartE2EDuration="44.514111167s" podCreationTimestamp="2026-03-19 18:58:35 +0000 UTC" firstStartedPulling="2026-03-19 18:58:36.483700271 +0000 UTC m=+126.588730110" lastFinishedPulling="2026-03-19 18:59:18.556316207 +0000 UTC m=+168.661346056" observedRunningTime="2026-03-19 18:59:19.510779148 +0000 UTC m=+169.615808997" watchObservedRunningTime="2026-03-19 18:59:19.514111167 +0000 UTC m=+169.619141016" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.522280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.522933 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sn2h\" (UniqueName: \"kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.522979 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.526527 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.526592 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " 
pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.628605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.630668 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sn2h\" (UniqueName: \"kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.630718 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.630841 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.630873 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert\") pod 
\"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.629807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.632289 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.637677 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.651321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.655120 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sn2h\" (UniqueName: 
\"kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h\") pod \"controller-manager-756bf9fcc-4zpkn\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.707204 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.732150 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert\") pod \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.732208 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lxg5\" (UniqueName: \"kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5\") pod \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.732251 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca\") pod \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.732274 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config\") pod \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\" (UID: \"32ec908a-df77-42a6-bd5f-f1c9ee56639d\") " Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.733518 5033 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config" (OuterVolumeSpecName: "config") pod "32ec908a-df77-42a6-bd5f-f1c9ee56639d" (UID: "32ec908a-df77-42a6-bd5f-f1c9ee56639d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.736244 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca" (OuterVolumeSpecName: "client-ca") pod "32ec908a-df77-42a6-bd5f-f1c9ee56639d" (UID: "32ec908a-df77-42a6-bd5f-f1c9ee56639d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.737121 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5" (OuterVolumeSpecName: "kube-api-access-6lxg5") pod "32ec908a-df77-42a6-bd5f-f1c9ee56639d" (UID: "32ec908a-df77-42a6-bd5f-f1c9ee56639d"). InnerVolumeSpecName "kube-api-access-6lxg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.737578 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32ec908a-df77-42a6-bd5f-f1c9ee56639d" (UID: "32ec908a-df77-42a6-bd5f-f1c9ee56639d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.813270 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.834233 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lxg5\" (UniqueName: \"kubernetes.io/projected/32ec908a-df77-42a6-bd5f-f1c9ee56639d-kube-api-access-6lxg5\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.834281 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.834292 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ec908a-df77-42a6-bd5f-f1c9ee56639d-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:19 crc kubenswrapper[5033]: I0319 18:59:19.834303 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ec908a-df77-42a6-bd5f-f1c9ee56639d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.223325 5033 csr.go:261] certificate signing request csr-6ssqs is approved, waiting to be issued Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.229755 5033 csr.go:257] certificate signing request csr-6ssqs is issued Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.331216 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.350369 5033 generic.go:334] "Generic (PLEG): container finished" podID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" containerID="9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a" exitCode=0 Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.350423 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" event={"ID":"32ec908a-df77-42a6-bd5f-f1c9ee56639d","Type":"ContainerDied","Data":"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a"} Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.350447 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" event={"ID":"32ec908a-df77-42a6-bd5f-f1c9ee56639d","Type":"ContainerDied","Data":"60db44753d77a57fc220238ae70a69163bba7efd12079326614b6573c4049e26"} Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.350482 5033 scope.go:117] "RemoveContainer" containerID="9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.350589 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.356550 5033 generic.go:334] "Generic (PLEG): container finished" podID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerID="e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075" exitCode=0 Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.356651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerDied","Data":"e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075"} Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.367010 5033 generic.go:334] "Generic (PLEG): container finished" podID="05cd9325-9740-4a70-98a7-3de9ebb30035" containerID="7920ff1eb303c17aa123a03056d83eb05368af06b8d4fbc2b64b3a373ec7144a" exitCode=0 Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.368022 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-gls8c" 
event={"ID":"05cd9325-9740-4a70-98a7-3de9ebb30035","Type":"ContainerDied","Data":"7920ff1eb303c17aa123a03056d83eb05368af06b8d4fbc2b64b3a373ec7144a"} Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.401687 5033 scope.go:117] "RemoveContainer" containerID="9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a" Mar 19 18:59:20 crc kubenswrapper[5033]: E0319 18:59:20.403800 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a\": container with ID starting with 9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a not found: ID does not exist" containerID="9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.403847 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a"} err="failed to get container status \"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a\": rpc error: code = NotFound desc = could not find container \"9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a\": container with ID starting with 9342baae618f4259ce29f385ce257feb6611b9bc4c0660e8bd257f5b9a3c8a8a not found: ID does not exist" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.438154 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.443089 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ff66c66-6fn42"] Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.454908 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 
18:59:20 crc kubenswrapper[5033]: E0319 18:59:20.455129 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" containerName="route-controller-manager" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.455142 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" containerName="route-controller-manager" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.455246 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" containerName="route-controller-manager" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.455643 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.461505 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.461587 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.461519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.461860 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.462044 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.462164 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:20 
crc kubenswrapper[5033]: I0319 18:59:20.478232 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.546303 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.546399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.546468 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdbp\" (UniqueName: \"kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.546505 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 
18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.627971 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ec908a-df77-42a6-bd5f-f1c9ee56639d" path="/var/lib/kubelet/pods/32ec908a-df77-42a6-bd5f-f1c9ee56639d/volumes" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.628710 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb4fddb-4f2b-4dd9-977b-4ee590220f79" path="/var/lib/kubelet/pods/9bb4fddb-4f2b-4dd9-977b-4ee590220f79/volumes" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.647934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdbp\" (UniqueName: \"kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.648021 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.648053 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.648096 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.648867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.649172 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.656374 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 18:59:20.667213 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdbp\" (UniqueName: \"kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp\") pod \"route-controller-manager-7cb8b76d-hk5v7\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:20 crc kubenswrapper[5033]: I0319 
18:59:20.814210 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.175166 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.231376 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 21:01:57.137219232 +0000 UTC Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.231426 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6338h2m35.905796208s for next certificate rotation Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.384395 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" event={"ID":"e53c584e-1cc0-4b75-b39a-2e282fb3cf69","Type":"ContainerStarted","Data":"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767"} Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.384497 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" event={"ID":"e53c584e-1cc0-4b75-b39a-2e282fb3cf69","Type":"ContainerStarted","Data":"9bfa26e4d8f321f4587faa5aeab2a4fe69005acb084f6e863a1be625a9eab121"} Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.384876 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.395345 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.401002 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" event={"ID":"c9303bd3-4fc8-47d3-8e86-d85bb21626f4","Type":"ContainerStarted","Data":"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797"} Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.401295 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" event={"ID":"c9303bd3-4fc8-47d3-8e86-d85bb21626f4","Type":"ContainerStarted","Data":"881380d6eb2be4264ef1a6f3b47bf07e6ba288bfd467e4890ad3384d61d48aa3"} Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.401738 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.403424 5033 patch_prober.go:28] interesting pod/route-controller-manager-7cb8b76d-hk5v7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.403492 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.410538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerStarted","Data":"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2"} Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.440615 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" podStartSLOduration=6.440530607 podStartE2EDuration="6.440530607s" podCreationTimestamp="2026-03-19 18:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:21.419858525 +0000 UTC m=+171.524888384" watchObservedRunningTime="2026-03-19 18:59:21.440530607 +0000 UTC m=+171.545560456" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.450683 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hhm4g" podStartSLOduration=3.056908116 podStartE2EDuration="47.45065663s" podCreationTimestamp="2026-03-19 18:58:34 +0000 UTC" firstStartedPulling="2026-03-19 18:58:36.527988475 +0000 UTC m=+126.633018324" lastFinishedPulling="2026-03-19 18:59:20.921736989 +0000 UTC m=+171.026766838" observedRunningTime="2026-03-19 18:59:21.445348144 +0000 UTC m=+171.550377993" watchObservedRunningTime="2026-03-19 18:59:21.45065663 +0000 UTC m=+171.555686489" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.813360 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.828766 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" podStartSLOduration=5.828752519 podStartE2EDuration="5.828752519s" podCreationTimestamp="2026-03-19 18:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:21.579765159 +0000 UTC m=+171.684795008" watchObservedRunningTime="2026-03-19 18:59:21.828752519 +0000 UTC m=+171.933782368" Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.969356 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxr5x\" (UniqueName: \"kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x\") pod \"05cd9325-9740-4a70-98a7-3de9ebb30035\" (UID: \"05cd9325-9740-4a70-98a7-3de9ebb30035\") " Mar 19 18:59:21 crc kubenswrapper[5033]: I0319 18:59:21.979885 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x" (OuterVolumeSpecName: "kube-api-access-pxr5x") pod "05cd9325-9740-4a70-98a7-3de9ebb30035" (UID: "05cd9325-9740-4a70-98a7-3de9ebb30035"). InnerVolumeSpecName "kube-api-access-pxr5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.070916 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxr5x\" (UniqueName: \"kubernetes.io/projected/05cd9325-9740-4a70-98a7-3de9ebb30035-kube-api-access-pxr5x\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.231790 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 19:29:16.820524382 +0000 UTC Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.231833 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6816h29m54.588694044s for next certificate rotation Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.420170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-gls8c" event={"ID":"05cd9325-9740-4a70-98a7-3de9ebb30035","Type":"ContainerDied","Data":"82e57b8f88c5825e9a8bc576ca0bf3d5a54f57a7aae1cecbc76e4993030c0fd8"} Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.420240 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82e57b8f88c5825e9a8bc576ca0bf3d5a54f57a7aae1cecbc76e4993030c0fd8" Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.420303 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-gls8c" Mar 19 18:59:22 crc kubenswrapper[5033]: I0319 18:59:22.427321 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:24 crc kubenswrapper[5033]: I0319 18:59:24.854589 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:59:24 crc kubenswrapper[5033]: I0319 18:59:24.855050 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.003042 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.003133 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.068763 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.069860 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.218130 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.218195 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.255288 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.421471 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.421542 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.474900 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.481405 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.489183 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:25 crc kubenswrapper[5033]: I0319 18:59:25.532429 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.028375 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.028769 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.093075 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.268463 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.385359 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.385415 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.424932 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.460351 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bhrx9" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="registry-server" containerID="cri-o://d42adf2e17cc1eb2320d1e47cea941ce3b083b0e3c3a366e9744228956b129d3" gracePeriod=2 Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.494092 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.512654 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.864750 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:59:27 crc kubenswrapper[5033]: I0319 18:59:27.864981 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-np9gj" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="registry-server" containerID="cri-o://0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98" gracePeriod=2 Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.467836 5033 generic.go:334] "Generic (PLEG): container finished" podID="86345a30-6cc0-4359-8051-e508c71833f4" 
containerID="d42adf2e17cc1eb2320d1e47cea941ce3b083b0e3c3a366e9744228956b129d3" exitCode=0 Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.467904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerDied","Data":"d42adf2e17cc1eb2320d1e47cea941ce3b083b0e3c3a366e9744228956b129d3"} Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.842235 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.882635 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz\") pod \"86345a30-6cc0-4359-8051-e508c71833f4\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.882764 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities\") pod \"86345a30-6cc0-4359-8051-e508c71833f4\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.882785 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content\") pod \"86345a30-6cc0-4359-8051-e508c71833f4\" (UID: \"86345a30-6cc0-4359-8051-e508c71833f4\") " Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.884223 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities" (OuterVolumeSpecName: "utilities") pod "86345a30-6cc0-4359-8051-e508c71833f4" (UID: 
"86345a30-6cc0-4359-8051-e508c71833f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.889117 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz" (OuterVolumeSpecName: "kube-api-access-2xbcz") pod "86345a30-6cc0-4359-8051-e508c71833f4" (UID: "86345a30-6cc0-4359-8051-e508c71833f4"). InnerVolumeSpecName "kube-api-access-2xbcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.937901 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86345a30-6cc0-4359-8051-e508c71833f4" (UID: "86345a30-6cc0-4359-8051-e508c71833f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.984356 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.984390 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86345a30-6cc0-4359-8051-e508c71833f4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:28 crc kubenswrapper[5033]: I0319 18:59:28.984404 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbcz\" (UniqueName: \"kubernetes.io/projected/86345a30-6cc0-4359-8051-e508c71833f4-kube-api-access-2xbcz\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.175983 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.287770 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities\") pod \"8085051d-b375-4eb4-a7d9-359fb0530c52\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.287882 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfnnl\" (UniqueName: \"kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl\") pod \"8085051d-b375-4eb4-a7d9-359fb0530c52\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.287926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content\") pod \"8085051d-b375-4eb4-a7d9-359fb0530c52\" (UID: \"8085051d-b375-4eb4-a7d9-359fb0530c52\") " Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.293544 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities" (OuterVolumeSpecName: "utilities") pod "8085051d-b375-4eb4-a7d9-359fb0530c52" (UID: "8085051d-b375-4eb4-a7d9-359fb0530c52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.296610 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl" (OuterVolumeSpecName: "kube-api-access-gfnnl") pod "8085051d-b375-4eb4-a7d9-359fb0530c52" (UID: "8085051d-b375-4eb4-a7d9-359fb0530c52"). InnerVolumeSpecName "kube-api-access-gfnnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.339861 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8085051d-b375-4eb4-a7d9-359fb0530c52" (UID: "8085051d-b375-4eb4-a7d9-359fb0530c52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.389867 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.389910 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfnnl\" (UniqueName: \"kubernetes.io/projected/8085051d-b375-4eb4-a7d9-359fb0530c52-kube-api-access-gfnnl\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.389925 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8085051d-b375-4eb4-a7d9-359fb0530c52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.482900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhrx9" event={"ID":"86345a30-6cc0-4359-8051-e508c71833f4","Type":"ContainerDied","Data":"1ed49b2d2c6e0f3040b0b800f6960989c5e1bbaf71098993481981a5ef4dc05d"} Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.482949 5033 scope.go:117] "RemoveContainer" containerID="d42adf2e17cc1eb2320d1e47cea941ce3b083b0e3c3a366e9744228956b129d3" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.482969 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhrx9" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.485360 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerStarted","Data":"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01"} Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.488631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerStarted","Data":"5fe58a172cbd0c2c395ab590ba70ccc2448b3659b1aa9a419f3accddff721560"} Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.491023 5033 generic.go:334] "Generic (PLEG): container finished" podID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerID="0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98" exitCode=0 Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.491062 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerDied","Data":"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98"} Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.491085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-np9gj" event={"ID":"8085051d-b375-4eb4-a7d9-359fb0530c52","Type":"ContainerDied","Data":"eff53e488b55902d1f03c66ebbbdf2c5905f4732b5ac731df901ac34619fc8fb"} Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.491156 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-np9gj" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.508317 5033 scope.go:117] "RemoveContainer" containerID="88581cb38f9b611578e93901df6bcc7b34d5764ca56acc9d9ebd9c99317b9080" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.537019 5033 scope.go:117] "RemoveContainer" containerID="03de50e51d88a9f6075957331348b2d126d597f2ef83383125b95a6d01c7d017" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.552604 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.556571 5033 scope.go:117] "RemoveContainer" containerID="0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.563420 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-np9gj"] Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.569690 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.572096 5033 scope.go:117] "RemoveContainer" containerID="fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.572312 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bhrx9"] Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.584906 5033 scope.go:117] "RemoveContainer" containerID="4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.596034 5033 scope.go:117] "RemoveContainer" containerID="0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98" Mar 19 18:59:29 crc kubenswrapper[5033]: E0319 18:59:29.596372 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98\": container with ID starting with 0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98 not found: ID does not exist" containerID="0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.596416 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98"} err="failed to get container status \"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98\": rpc error: code = NotFound desc = could not find container \"0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98\": container with ID starting with 0139f5dacd4ac743b4d7fb23c3cb7c7f41a468865d9dcb5cc12fd495a4d6aa98 not found: ID does not exist" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.596470 5033 scope.go:117] "RemoveContainer" containerID="fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd" Mar 19 18:59:29 crc kubenswrapper[5033]: E0319 18:59:29.596769 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd\": container with ID starting with fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd not found: ID does not exist" containerID="fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.596801 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd"} err="failed to get container status \"fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd\": rpc error: code = NotFound desc = could not find container 
\"fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd\": container with ID starting with fb0dd720c6ecb17e32e9e2917f263abfa92207fec856c9cf2d425e3e46aff3dd not found: ID does not exist" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.596839 5033 scope.go:117] "RemoveContainer" containerID="4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62" Mar 19 18:59:29 crc kubenswrapper[5033]: E0319 18:59:29.597083 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62\": container with ID starting with 4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62 not found: ID does not exist" containerID="4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62" Mar 19 18:59:29 crc kubenswrapper[5033]: I0319 18:59:29.597105 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62"} err="failed to get container status \"4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62\": rpc error: code = NotFound desc = could not find container \"4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62\": container with ID starting with 4de6eb04d5b671f148849f272c71f700585d7d5290ad1da3cd0b16b797501e62 not found: ID does not exist" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.266416 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.267526 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-978qb" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="registry-server" containerID="cri-o://77a8871a3ceb45b574459da2eadf532d247ec44713b7eb5a8425f70967fb90b6" gracePeriod=2 Mar 19 18:59:30 crc 
kubenswrapper[5033]: I0319 18:59:30.499856 5033 generic.go:334] "Generic (PLEG): container finished" podID="0cedfa0d-8527-4d20-9326-88bf40011456" containerID="5fe58a172cbd0c2c395ab590ba70ccc2448b3659b1aa9a419f3accddff721560" exitCode=0 Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.499919 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerDied","Data":"5fe58a172cbd0c2c395ab590ba70ccc2448b3659b1aa9a419f3accddff721560"} Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.507170 5033 generic.go:334] "Generic (PLEG): container finished" podID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerID="77a8871a3ceb45b574459da2eadf532d247ec44713b7eb5a8425f70967fb90b6" exitCode=0 Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.507253 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerDied","Data":"77a8871a3ceb45b574459da2eadf532d247ec44713b7eb5a8425f70967fb90b6"} Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.508995 5033 generic.go:334] "Generic (PLEG): container finished" podID="63e3da86-ceaf-47ef-81af-07853efd035b" containerID="f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01" exitCode=0 Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.509036 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerDied","Data":"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01"} Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.656087 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" path="/var/lib/kubelet/pods/8085051d-b375-4eb4-a7d9-359fb0530c52/volumes" Mar 19 18:59:30 crc kubenswrapper[5033]: 
I0319 18:59:30.659320 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86345a30-6cc0-4359-8051-e508c71833f4" path="/var/lib/kubelet/pods/86345a30-6cc0-4359-8051-e508c71833f4/volumes" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.724442 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.806976 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq48d\" (UniqueName: \"kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d\") pod \"f38e7c13-214e-4636-83f2-bf0025afbec3\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.807093 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content\") pod \"f38e7c13-214e-4636-83f2-bf0025afbec3\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.807121 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities\") pod \"f38e7c13-214e-4636-83f2-bf0025afbec3\" (UID: \"f38e7c13-214e-4636-83f2-bf0025afbec3\") " Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.807996 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities" (OuterVolumeSpecName: "utilities") pod "f38e7c13-214e-4636-83f2-bf0025afbec3" (UID: "f38e7c13-214e-4636-83f2-bf0025afbec3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.830635 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d" (OuterVolumeSpecName: "kube-api-access-dq48d") pod "f38e7c13-214e-4636-83f2-bf0025afbec3" (UID: "f38e7c13-214e-4636-83f2-bf0025afbec3"). InnerVolumeSpecName "kube-api-access-dq48d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.834734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f38e7c13-214e-4636-83f2-bf0025afbec3" (UID: "f38e7c13-214e-4636-83f2-bf0025afbec3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.908077 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.908102 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f38e7c13-214e-4636-83f2-bf0025afbec3-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[5033]: I0319 18:59:30.908113 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq48d\" (UniqueName: \"kubernetes.io/projected/f38e7c13-214e-4636-83f2-bf0025afbec3-kube-api-access-dq48d\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.523538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-978qb" 
event={"ID":"f38e7c13-214e-4636-83f2-bf0025afbec3","Type":"ContainerDied","Data":"a46a70e04562e17b4925586f57e4720cc854af426dbcbd887701b7e023ec2ac7"} Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.523612 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-978qb" Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.523877 5033 scope.go:117] "RemoveContainer" containerID="77a8871a3ceb45b574459da2eadf532d247ec44713b7eb5a8425f70967fb90b6" Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.556919 5033 scope.go:117] "RemoveContainer" containerID="a1349a7c32b6fe2aa5c614b72f3bdb7a179e7f2b96361e7bb0a8bfcb6e63919f" Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.563623 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.572400 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-978qb"] Mar 19 18:59:31 crc kubenswrapper[5033]: I0319 18:59:31.573958 5033 scope.go:117] "RemoveContainer" containerID="a37fad93b3e3d35e755f68ca6cbcc50647025f6e1a9eea7d297f74a3fee8df1e" Mar 19 18:59:32 crc kubenswrapper[5033]: I0319 18:59:32.531310 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerStarted","Data":"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5"} Mar 19 18:59:32 crc kubenswrapper[5033]: I0319 18:59:32.534609 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerStarted","Data":"0e5cc612353a3252fbad92dbfcd4db9ebc90d48b9241603c4013b287891c4887"} Mar 19 18:59:32 crc kubenswrapper[5033]: I0319 18:59:32.547896 5033 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-ddsbp" podStartSLOduration=4.177223207 podStartE2EDuration="55.547871986s" podCreationTimestamp="2026-03-19 18:58:37 +0000 UTC" firstStartedPulling="2026-03-19 18:58:39.717065236 +0000 UTC m=+129.822095085" lastFinishedPulling="2026-03-19 18:59:31.087714015 +0000 UTC m=+181.192743864" observedRunningTime="2026-03-19 18:59:32.545746663 +0000 UTC m=+182.650776552" watchObservedRunningTime="2026-03-19 18:59:32.547871986 +0000 UTC m=+182.652901845" Mar 19 18:59:32 crc kubenswrapper[5033]: I0319 18:59:32.563618 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4zm4" podStartSLOduration=2.790012892 podStartE2EDuration="54.563597666s" podCreationTimestamp="2026-03-19 18:58:38 +0000 UTC" firstStartedPulling="2026-03-19 18:58:39.750416028 +0000 UTC m=+129.855445867" lastFinishedPulling="2026-03-19 18:59:31.524000792 +0000 UTC m=+181.629030641" observedRunningTime="2026-03-19 18:59:32.562986772 +0000 UTC m=+182.668016661" watchObservedRunningTime="2026-03-19 18:59:32.563597666 +0000 UTC m=+182.668627525" Mar 19 18:59:32 crc kubenswrapper[5033]: I0319 18:59:32.629683 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" path="/var/lib/kubelet/pods/f38e7c13-214e-4636-83f2-bf0025afbec3/volumes" Mar 19 18:59:35 crc kubenswrapper[5033]: I0319 18:59:35.061045 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 18:59:35 crc kubenswrapper[5033]: I0319 18:59:35.952175 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:35 crc kubenswrapper[5033]: I0319 18:59:35.952430 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" 
podUID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" containerName="controller-manager" containerID="cri-o://ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767" gracePeriod=30 Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.047594 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.047841 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerName="route-controller-manager" containerID="cri-o://737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797" gracePeriod=30 Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.491756 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.496216 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.557265 5033 generic.go:334] "Generic (PLEG): container finished" podID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" containerID="ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767" exitCode=0 Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.557303 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" event={"ID":"e53c584e-1cc0-4b75-b39a-2e282fb3cf69","Type":"ContainerDied","Data":"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767"} Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.557336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" event={"ID":"e53c584e-1cc0-4b75-b39a-2e282fb3cf69","Type":"ContainerDied","Data":"9bfa26e4d8f321f4587faa5aeab2a4fe69005acb084f6e863a1be625a9eab121"} Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.557331 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-756bf9fcc-4zpkn" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.557349 5033 scope.go:117] "RemoveContainer" containerID="ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.559032 5033 generic.go:334] "Generic (PLEG): container finished" podID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerID="737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797" exitCode=0 Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.559078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" event={"ID":"c9303bd3-4fc8-47d3-8e86-d85bb21626f4","Type":"ContainerDied","Data":"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797"} Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.559163 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" event={"ID":"c9303bd3-4fc8-47d3-8e86-d85bb21626f4","Type":"ContainerDied","Data":"881380d6eb2be4264ef1a6f3b47bf07e6ba288bfd467e4890ad3384d61d48aa3"} Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.559108 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.573086 5033 scope.go:117] "RemoveContainer" containerID="ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767" Mar 19 18:59:36 crc kubenswrapper[5033]: E0319 18:59:36.573579 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767\": container with ID starting with ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767 not found: ID does not exist" containerID="ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.573615 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767"} err="failed to get container status \"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767\": rpc error: code = NotFound desc = could not find container \"ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767\": container with ID starting with ce5bb2dc1bfe3861ec7f02ebc8a91ec31d37c03cac158ba1629e7e829bdf5767 not found: ID does not exist" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.573635 5033 scope.go:117] "RemoveContainer" containerID="737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.578925 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config\") pod \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.578995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config\") pod \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579022 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sn2h\" (UniqueName: \"kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h\") pod \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579072 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert\") pod \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579091 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert\") pod \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579135 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca\") pod \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579156 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca\") pod \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 
18:59:36.579199 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles\") pod \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\" (UID: \"e53c584e-1cc0-4b75-b39a-2e282fb3cf69\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.579245 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdbp\" (UniqueName: \"kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp\") pod \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\" (UID: \"c9303bd3-4fc8-47d3-8e86-d85bb21626f4\") " Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.580366 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config" (OuterVolumeSpecName: "config") pod "e53c584e-1cc0-4b75-b39a-2e282fb3cf69" (UID: "e53c584e-1cc0-4b75-b39a-2e282fb3cf69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.580413 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config" (OuterVolumeSpecName: "config") pod "c9303bd3-4fc8-47d3-8e86-d85bb21626f4" (UID: "c9303bd3-4fc8-47d3-8e86-d85bb21626f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.580703 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e53c584e-1cc0-4b75-b39a-2e282fb3cf69" (UID: "e53c584e-1cc0-4b75-b39a-2e282fb3cf69"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.580892 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca" (OuterVolumeSpecName: "client-ca") pod "e53c584e-1cc0-4b75-b39a-2e282fb3cf69" (UID: "e53c584e-1cc0-4b75-b39a-2e282fb3cf69"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.580896 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9303bd3-4fc8-47d3-8e86-d85bb21626f4" (UID: "c9303bd3-4fc8-47d3-8e86-d85bb21626f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.584690 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e53c584e-1cc0-4b75-b39a-2e282fb3cf69" (UID: "e53c584e-1cc0-4b75-b39a-2e282fb3cf69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.584697 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9303bd3-4fc8-47d3-8e86-d85bb21626f4" (UID: "c9303bd3-4fc8-47d3-8e86-d85bb21626f4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.584770 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp" (OuterVolumeSpecName: "kube-api-access-vcdbp") pod "c9303bd3-4fc8-47d3-8e86-d85bb21626f4" (UID: "c9303bd3-4fc8-47d3-8e86-d85bb21626f4"). InnerVolumeSpecName "kube-api-access-vcdbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.584859 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h" (OuterVolumeSpecName: "kube-api-access-5sn2h") pod "e53c584e-1cc0-4b75-b39a-2e282fb3cf69" (UID: "e53c584e-1cc0-4b75-b39a-2e282fb3cf69"). InnerVolumeSpecName "kube-api-access-5sn2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.595612 5033 scope.go:117] "RemoveContainer" containerID="737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797" Mar 19 18:59:36 crc kubenswrapper[5033]: E0319 18:59:36.596025 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797\": container with ID starting with 737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797 not found: ID does not exist" containerID="737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.596056 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797"} err="failed to get container status \"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797\": rpc error: code = NotFound desc = could 
not find container \"737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797\": container with ID starting with 737bb7abc5cd49cb6d48db42409780c669d42bac50218de2dfa183d2e514d797 not found: ID does not exist" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679776 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679821 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sn2h\" (UniqueName: \"kubernetes.io/projected/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-kube-api-access-5sn2h\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679837 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679851 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679863 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679873 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679883 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679894 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcdbp\" (UniqueName: \"kubernetes.io/projected/c9303bd3-4fc8-47d3-8e86-d85bb21626f4-kube-api-access-vcdbp\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.679906 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e53c584e-1cc0-4b75-b39a-2e282fb3cf69-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.875323 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.879362 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-756bf9fcc-4zpkn"] Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.890540 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 18:59:36 crc kubenswrapper[5033]: I0319 18:59:36.895737 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb8b76d-hk5v7"] Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.460537 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461007 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" containerName="oc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461021 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" containerName="oc" Mar 19 
18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461031 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="extract-utilities" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461038 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="extract-utilities" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461047 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461055 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461062 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461069 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461076 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461081 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461090 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="extract-utilities" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461095 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="extract-utilities" Mar 19 
18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461102 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461107 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461115 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461121 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="extract-content" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461129 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461134 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461143 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="extract-utilities" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461149 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="extract-utilities" Mar 19 18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461157 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" containerName="controller-manager" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461162 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" containerName="controller-manager" Mar 19 
18:59:37 crc kubenswrapper[5033]: E0319 18:59:37.461169 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerName="route-controller-manager" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461176 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerName="route-controller-manager" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461294 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="86345a30-6cc0-4359-8051-e508c71833f4" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461306 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" containerName="route-controller-manager" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461316 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" containerName="oc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461324 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38e7c13-214e-4636-83f2-bf0025afbec3" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461331 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" containerName="controller-manager" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461337 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8085051d-b375-4eb4-a7d9-359fb0530c52" containerName="registry-server" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.461764 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.462848 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.463866 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.465206 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.465762 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.465956 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.465970 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.466047 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.466096 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.467418 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.469639 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.469725 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.469934 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.469953 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.473685 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.473732 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.473749 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.481723 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488749 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488848 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwtk\" (UniqueName: \"kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488955 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.488988 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: 
\"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.489028 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgcm\" (UniqueName: \"kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.489050 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.489080 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.589732 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590320 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rqwtk\" (UniqueName: \"kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590356 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590381 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590434 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgcm\" (UniqueName: \"kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " 
pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590474 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590502 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.590518 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.591590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.591887 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: 
\"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.591996 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.592276 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.592380 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.594946 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.603169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.611049 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwtk\" (UniqueName: \"kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk\") pod \"route-controller-manager-84bd85978b-kvvw4\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.619254 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgcm\" (UniqueName: \"kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm\") pod \"controller-manager-fcdd876cf-zj9fc\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.776408 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:37 crc kubenswrapper[5033]: I0319 18:59:37.784034 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.017334 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.017385 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.053195 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.226016 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 18:59:38 crc kubenswrapper[5033]: W0319 18:59:38.231534 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275044ae_24c9_4bc4_a531_e31bef0b9596.slice/crio-665ad1f31ffb28a3ce999577f6844797db60123e49fa3788f9b17730b5f22db2 WatchSource:0}: Error finding container 665ad1f31ffb28a3ce999577f6844797db60123e49fa3788f9b17730b5f22db2: Status 404 returned error can't find the container with id 665ad1f31ffb28a3ce999577f6844797db60123e49fa3788f9b17730b5f22db2 Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.237385 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 18:59:38 crc kubenswrapper[5033]: W0319 18:59:38.248127 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405c8840_9fff_49ea_9fda_0ea0f942a7f2.slice/crio-540b497bfbe670e41048028a9d371c9988314ef86cc59c6735ec75c99b9913cd WatchSource:0}: Error finding container 
540b497bfbe670e41048028a9d371c9988314ef86cc59c6735ec75c99b9913cd: Status 404 returned error can't find the container with id 540b497bfbe670e41048028a9d371c9988314ef86cc59c6735ec75c99b9913cd Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.422614 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.422678 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.464078 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.570900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" event={"ID":"405c8840-9fff-49ea-9fda-0ea0f942a7f2","Type":"ContainerStarted","Data":"881cf0f7e5e5d838ad2568e58d4bcf43e4294ef6e9ef006d6818985b0c7af22d"} Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.570959 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.570975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" event={"ID":"405c8840-9fff-49ea-9fda-0ea0f942a7f2","Type":"ContainerStarted","Data":"540b497bfbe670e41048028a9d371c9988314ef86cc59c6735ec75c99b9913cd"} Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.572927 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" event={"ID":"275044ae-24c9-4bc4-a531-e31bef0b9596","Type":"ContainerStarted","Data":"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e"} Mar 19 
18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.572961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" event={"ID":"275044ae-24c9-4bc4-a531-e31bef0b9596","Type":"ContainerStarted","Data":"665ad1f31ffb28a3ce999577f6844797db60123e49fa3788f9b17730b5f22db2"} Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.581941 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.590118 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" podStartSLOduration=3.590099201 podStartE2EDuration="3.590099201s" podCreationTimestamp="2026-03-19 18:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:38.586606855 +0000 UTC m=+188.691636724" watchObservedRunningTime="2026-03-19 18:59:38.590099201 +0000 UTC m=+188.695129050" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.616160 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.628898 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9303bd3-4fc8-47d3-8e86-d85bb21626f4" path="/var/lib/kubelet/pods/c9303bd3-4fc8-47d3-8e86-d85bb21626f4/volumes" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.629558 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53c584e-1cc0-4b75-b39a-2e282fb3cf69" path="/var/lib/kubelet/pods/e53c584e-1cc0-4b75-b39a-2e282fb3cf69/volumes" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.640141 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 18:59:38 crc kubenswrapper[5033]: I0319 18:59:38.666846 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" podStartSLOduration=2.666825637 podStartE2EDuration="2.666825637s" podCreationTimestamp="2026-03-19 18:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:38.650006065 +0000 UTC m=+188.755035914" watchObservedRunningTime="2026-03-19 18:59:38.666825637 +0000 UTC m=+188.771855486" Mar 19 18:59:39 crc kubenswrapper[5033]: I0319 18:59:39.578624 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:39 crc kubenswrapper[5033]: I0319 18:59:39.587132 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 18:59:40 crc kubenswrapper[5033]: I0319 18:59:40.063420 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:59:40 crc kubenswrapper[5033]: I0319 18:59:40.582996 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4zm4" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="registry-server" containerID="cri-o://0e5cc612353a3252fbad92dbfcd4db9ebc90d48b9241603c4013b287891c4887" gracePeriod=2 Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.591804 5033 generic.go:334] "Generic (PLEG): container finished" podID="0cedfa0d-8527-4d20-9326-88bf40011456" containerID="0e5cc612353a3252fbad92dbfcd4db9ebc90d48b9241603c4013b287891c4887" exitCode=0 Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.591854 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerDied","Data":"0e5cc612353a3252fbad92dbfcd4db9ebc90d48b9241603c4013b287891c4887"} Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.951969 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.963195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities\") pod \"0cedfa0d-8527-4d20-9326-88bf40011456\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.963274 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content\") pod \"0cedfa0d-8527-4d20-9326-88bf40011456\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.963337 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6znjv\" (UniqueName: \"kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv\") pod \"0cedfa0d-8527-4d20-9326-88bf40011456\" (UID: \"0cedfa0d-8527-4d20-9326-88bf40011456\") " Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.964154 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities" (OuterVolumeSpecName: "utilities") pod "0cedfa0d-8527-4d20-9326-88bf40011456" (UID: "0cedfa0d-8527-4d20-9326-88bf40011456"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:41 crc kubenswrapper[5033]: I0319 18:59:41.970713 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv" (OuterVolumeSpecName: "kube-api-access-6znjv") pod "0cedfa0d-8527-4d20-9326-88bf40011456" (UID: "0cedfa0d-8527-4d20-9326-88bf40011456"). InnerVolumeSpecName "kube-api-access-6znjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.065266 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.065314 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6znjv\" (UniqueName: \"kubernetes.io/projected/0cedfa0d-8527-4d20-9326-88bf40011456-kube-api-access-6znjv\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.138280 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cedfa0d-8527-4d20-9326-88bf40011456" (UID: "0cedfa0d-8527-4d20-9326-88bf40011456"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.166655 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cedfa0d-8527-4d20-9326-88bf40011456-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.600509 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4zm4" event={"ID":"0cedfa0d-8527-4d20-9326-88bf40011456","Type":"ContainerDied","Data":"1a77c8ece250db43ba830fa8ead1a2ab8e6ede54c97c58eb73cdb4f33364d59c"} Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.600564 5033 scope.go:117] "RemoveContainer" containerID="0e5cc612353a3252fbad92dbfcd4db9ebc90d48b9241603c4013b287891c4887" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.600568 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4zm4" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.637602 5033 scope.go:117] "RemoveContainer" containerID="5fe58a172cbd0c2c395ab590ba70ccc2448b3659b1aa9a419f3accddff721560" Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.641682 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.643759 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4zm4"] Mar 19 18:59:42 crc kubenswrapper[5033]: I0319 18:59:42.657475 5033 scope.go:117] "RemoveContainer" containerID="1c2a2a15063ab56e9ecaffba6e97c49043b42564f4681d0477cc98b52b4be8ab" Mar 19 18:59:44 crc kubenswrapper[5033]: I0319 18:59:44.631893 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" path="/var/lib/kubelet/pods/0cedfa0d-8527-4d20-9326-88bf40011456/volumes" Mar 19 18:59:51 crc 
kubenswrapper[5033]: I0319 18:59:51.261605 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kd52d"] Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.449561 5033 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.449847 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="registry-server" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.449869 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="registry-server" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.449884 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="extract-utilities" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.449894 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="extract-utilities" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.449908 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="extract-content" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.449916 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="extract-content" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.450065 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cedfa0d-8527-4d20-9326-88bf40011456" containerName="registry-server" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.452534 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.452977 5033 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453004 5033 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453128 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453139 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453150 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453158 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453288 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453295 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453307 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453313 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453320 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453326 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453333 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453338 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453345 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453351 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453359 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453416 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8" gracePeriod=15 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453422 5033 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93" gracePeriod=15 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453477 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5" gracePeriod=15 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453521 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5" gracePeriod=15 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453515 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc" gracePeriod=15 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453555 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453620 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453628 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453719 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453728 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453735 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453743 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453753 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453761 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453768 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453774 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.453847 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 
crc kubenswrapper[5033]: I0319 18:59:51.453854 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.453945 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.487259 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610612 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610642 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610675 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610692 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.610790 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.674202 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="67290f23-7880-41c5-a90f-575154eb45da" containerID="6ca7a60b981e237d69450d2a226f6a246f9bcb0ac59c935bec082153e810a336" exitCode=0 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.674291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"67290f23-7880-41c5-a90f-575154eb45da","Type":"ContainerDied","Data":"6ca7a60b981e237d69450d2a226f6a246f9bcb0ac59c935bec082153e810a336"} Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.675154 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.675409 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.675712 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.676506 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.677793 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.678391 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8" exitCode=0 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.678412 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc" exitCode=0 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.678419 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5" exitCode=0 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.678426 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5" exitCode=2 Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.678477 5033 scope.go:117] "RemoveContainer" containerID="f912984affd7dd3f3caf265c41acf79224fb07051ae6d7e3b0cc5b404e1bda1e" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711585 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711674 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711695 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711721 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711821 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.711891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712294 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712495 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712596 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 
18:59:51.712696 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.712765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: I0319 18:59:51.786281 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:51 crc kubenswrapper[5033]: W0319 18:59:51.804625 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-37cd3f2ba355eea70dce0fdd2a4bfd7b69cc77c986badc3a639a3f05d725f605 WatchSource:0}: Error finding container 37cd3f2ba355eea70dce0fdd2a4bfd7b69cc77c986badc3a639a3f05d725f605: Status 404 returned error can't find the container with id 37cd3f2ba355eea70dce0fdd2a4bfd7b69cc77c986badc3a639a3f05d725f605 Mar 19 18:59:51 crc kubenswrapper[5033]: E0319 18:59:51.807080 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e53355812f979 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC m=+201.911594578,LastTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC m=+201.911594578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:59:52 crc kubenswrapper[5033]: I0319 18:59:52.686417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c"} Mar 19 18:59:52 crc kubenswrapper[5033]: I0319 18:59:52.687190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"37cd3f2ba355eea70dce0fdd2a4bfd7b69cc77c986badc3a639a3f05d725f605"} Mar 19 18:59:52 crc kubenswrapper[5033]: I0319 18:59:52.687496 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[5033]: I0319 18:59:52.688412 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[5033]: I0319 18:59:52.690118 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.152258 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.153714 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.154627 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239236 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock\") pod \"67290f23-7880-41c5-a90f-575154eb45da\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239323 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access\") pod \"67290f23-7880-41c5-a90f-575154eb45da\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir\") pod \"67290f23-7880-41c5-a90f-575154eb45da\" (UID: \"67290f23-7880-41c5-a90f-575154eb45da\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239376 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock" (OuterVolumeSpecName: "var-lock") pod "67290f23-7880-41c5-a90f-575154eb45da" (UID: "67290f23-7880-41c5-a90f-575154eb45da"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239661 5033 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.239636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "67290f23-7880-41c5-a90f-575154eb45da" (UID: "67290f23-7880-41c5-a90f-575154eb45da"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.244474 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "67290f23-7880-41c5-a90f-575154eb45da" (UID: "67290f23-7880-41c5-a90f-575154eb45da"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.340760 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67290f23-7880-41c5-a90f-575154eb45da-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.341040 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67290f23-7880-41c5-a90f-575154eb45da-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.697536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"67290f23-7880-41c5-a90f-575154eb45da","Type":"ContainerDied","Data":"46e782e467969bbfde52d3b5a989909f9beead346dc2bc4fc0eca426e4c5948f"} Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.697875 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e782e467969bbfde52d3b5a989909f9beead346dc2bc4fc0eca426e4c5948f" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.697956 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.710930 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.711105 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.832196 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.833132 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.833890 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.834305 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.834819 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.950981 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951039 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951103 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951237 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951309 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951472 5033 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951496 5033 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:53 crc kubenswrapper[5033]: I0319 18:59:53.951505 5033 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.631988 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.708739 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.718169 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93" exitCode=0 Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.718304 5033 scope.go:117] "RemoveContainer" containerID="d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.718390 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.721718 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.722277 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.722854 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.723247 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.723614 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.724015 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.743953 5033 scope.go:117] "RemoveContainer" containerID="d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.803292 5033 scope.go:117] "RemoveContainer" containerID="56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.822818 5033 scope.go:117] "RemoveContainer" containerID="4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.841987 5033 scope.go:117] "RemoveContainer" containerID="31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.866727 5033 scope.go:117] "RemoveContainer" containerID="52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.921865 5033 scope.go:117] "RemoveContainer" containerID="d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.922352 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8\": container with ID starting with d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8 not found: ID does not 
exist" containerID="d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.922396 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8"} err="failed to get container status \"d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8\": rpc error: code = NotFound desc = could not find container \"d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8\": container with ID starting with d185dd42058abdf1658da40b1ca09c2ca6663deabb872c7e64e20abbe09ab8c8 not found: ID does not exist" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.922434 5033 scope.go:117] "RemoveContainer" containerID="d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.922961 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\": container with ID starting with d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc not found: ID does not exist" containerID="d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.922992 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc"} err="failed to get container status \"d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\": rpc error: code = NotFound desc = could not find container \"d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc\": container with ID starting with d3f017b3bd9133409823261aead7741a5c5b07bab3b045715bda33385b53e1bc not found: ID does not exist" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.923010 5033 scope.go:117] 
"RemoveContainer" containerID="56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.923435 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\": container with ID starting with 56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5 not found: ID does not exist" containerID="56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.923522 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5"} err="failed to get container status \"56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\": rpc error: code = NotFound desc = could not find container \"56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5\": container with ID starting with 56f0dedbe318c8028a7b5e9165b049618ad69ff45bef2729f8f996196b7de4f5 not found: ID does not exist" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.923563 5033 scope.go:117] "RemoveContainer" containerID="4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.924057 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\": container with ID starting with 4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5 not found: ID does not exist" containerID="4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.924094 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5"} err="failed to get container status \"4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\": rpc error: code = NotFound desc = could not find container \"4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5\": container with ID starting with 4999b4b601ca2752a4b86e881025b1a5f56a0e8cf7f2e6772e990a1bfb579de5 not found: ID does not exist" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.924113 5033 scope.go:117] "RemoveContainer" containerID="31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.924506 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\": container with ID starting with 31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93 not found: ID does not exist" containerID="31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.924580 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93"} err="failed to get container status \"31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\": rpc error: code = NotFound desc = could not find container \"31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93\": container with ID starting with 31b3400f5121bb9242e8dbe3365fdbba7233b604e1c125da643e493bb230aa93 not found: ID does not exist" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.924624 5033 scope.go:117] "RemoveContainer" containerID="52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c" Mar 19 18:59:54 crc kubenswrapper[5033]: E0319 18:59:54.925139 5033 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\": container with ID starting with 52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c not found: ID does not exist" containerID="52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c" Mar 19 18:59:54 crc kubenswrapper[5033]: I0319 18:59:54.925170 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c"} err="failed to get container status \"52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\": rpc error: code = NotFound desc = could not find container \"52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c\": container with ID starting with 52676295d217556f1e192a3607a14687957397d652a2f16ecc1d54f0f61c4b9c not found: ID does not exist" Mar 19 18:59:55 crc kubenswrapper[5033]: E0319 18:59:55.736109 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e53355812f979 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC m=+201.911594578,LastTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC 
m=+201.911594578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.823563 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.824527 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.825065 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.825641 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.826170 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 18:59:59 crc kubenswrapper[5033]: I0319 18:59:59.826202 5033 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 18:59:59 crc kubenswrapper[5033]: E0319 18:59:59.826385 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Mar 19 19:00:00 crc kubenswrapper[5033]: E0319 19:00:00.027585 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Mar 19 19:00:00 crc kubenswrapper[5033]: E0319 19:00:00.428748 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Mar 19 19:00:00 crc kubenswrapper[5033]: I0319 19:00:00.638793 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:00 crc kubenswrapper[5033]: I0319 19:00:00.639524 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:01 crc kubenswrapper[5033]: E0319 19:00:01.229574 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Mar 19 19:00:02 crc 
kubenswrapper[5033]: E0319 19:00:02.830727 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.793285 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.793400 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="75875553f125bbbd24248852d61935696cf55f7aede9e3a7b10bfbceb78e2d58" exitCode=1 Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.793496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"75875553f125bbbd24248852d61935696cf55f7aede9e3a7b10bfbceb78e2d58"} Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.794357 5033 scope.go:117] "RemoveContainer" containerID="75875553f125bbbd24248852d61935696cf55f7aede9e3a7b10bfbceb78e2d58" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.794598 5033 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.795139 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.795490 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:04 crc kubenswrapper[5033]: I0319 19:00:04.935487 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:05 crc kubenswrapper[5033]: E0319 19:00:05.737662 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e53355812f979 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC m=+201.911594578,LastTimestamp:2026-03-19 18:59:51.806564729 +0000 UTC m=+201.911594578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 19:00:05 crc 
kubenswrapper[5033]: I0319 19:00:05.804110 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.804164 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"823b10160c04d0d6345540c9dafbc38e20ad5f26f121709fc9816adaa9891d49"} Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.805206 5033 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.805476 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.805775 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.810094 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 
19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.810312 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 19:00:05 crc kubenswrapper[5033]: I0319 19:00:05.810384 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 19:00:06 crc kubenswrapper[5033]: E0319 19:00:06.031799 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="6.4s" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.620417 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.621272 5033 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.622050 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.622758 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.637114 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.637144 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:06 crc kubenswrapper[5033]: E0319 19:00:06.637533 5033 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.637984 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:06 crc kubenswrapper[5033]: W0319 19:00:06.662884 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-bf656a2da8620d9c3a265c20c9c24d0fcb512ddb2401157a52157cec5b0bb5f8 WatchSource:0}: Error finding container bf656a2da8620d9c3a265c20c9c24d0fcb512ddb2401157a52157cec5b0bb5f8: Status 404 returned error can't find the container with id bf656a2da8620d9c3a265c20c9c24d0fcb512ddb2401157a52157cec5b0bb5f8 Mar 19 19:00:06 crc kubenswrapper[5033]: I0319 19:00:06.810412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf656a2da8620d9c3a265c20c9c24d0fcb512ddb2401157a52157cec5b0bb5f8"} Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.817326 5033 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d3b45c87153dff9c7df9bf4a3b374524dec33cc95ce37e7c7634075b250ecd7a" exitCode=0 Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.817375 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d3b45c87153dff9c7df9bf4a3b374524dec33cc95ce37e7c7634075b250ecd7a"} Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.817581 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.817598 5033 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:07 crc kubenswrapper[5033]: E0319 19:00:07.817912 5033 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.818060 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.818269 5033 status_manager.go:851] "Failed to get status for pod" podUID="67290f23-7880-41c5-a90f-575154eb45da" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:07 crc kubenswrapper[5033]: I0319 19:00:07.818520 5033 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 19 19:00:08 crc kubenswrapper[5033]: I0319 19:00:08.826429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73e3fb8f844412bd92c7d1aa6d08efda2438750f9ba853053ed347df5d67d700"} Mar 19 19:00:08 crc 
kubenswrapper[5033]: I0319 19:00:08.826830 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9b8e380164e76c92f5807beefea0eb24b6eaf4a8543b274556af27dfd9e745b"} Mar 19 19:00:08 crc kubenswrapper[5033]: I0319 19:00:08.826851 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a0a521a3b16c721abb9ef14a80e236e5f01e43a3b13096515bee7dbb2b676aa0"} Mar 19 19:00:08 crc kubenswrapper[5033]: I0319 19:00:08.826862 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6db93307c0ff821ad0bb77abc706b65e65343243ede6df43f633daadf9e6fcca"} Mar 19 19:00:09 crc kubenswrapper[5033]: I0319 19:00:09.835250 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5529b2ae96d1b3a0caf5c00566e63b9d333c781dff9c108275e6fb14548c223e"} Mar 19 19:00:09 crc kubenswrapper[5033]: I0319 19:00:09.835548 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:09 crc kubenswrapper[5033]: I0319 19:00:09.835594 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:09 crc kubenswrapper[5033]: I0319 19:00:09.835612 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:10 crc kubenswrapper[5033]: I0319 19:00:10.758433 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:00:10 crc kubenswrapper[5033]: I0319 19:00:10.758518 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:00:11 crc kubenswrapper[5033]: I0319 19:00:11.638967 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:11 crc kubenswrapper[5033]: I0319 19:00:11.639025 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:11 crc kubenswrapper[5033]: I0319 19:00:11.647853 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:12 crc kubenswrapper[5033]: I0319 19:00:12.664188 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:14 crc kubenswrapper[5033]: I0319 19:00:14.845875 5033 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:14 crc kubenswrapper[5033]: I0319 19:00:14.910075 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="122a08ef-bda0-4996-9a9f-f85ee4194826" Mar 19 19:00:15 crc kubenswrapper[5033]: I0319 19:00:15.813953 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:15 crc kubenswrapper[5033]: I0319 19:00:15.818686 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:15 crc kubenswrapper[5033]: I0319 19:00:15.870005 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:15 crc kubenswrapper[5033]: I0319 19:00:15.870056 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0718fe93-d69d-4b40-a912-533ed06ad37b" Mar 19 19:00:15 crc kubenswrapper[5033]: I0319 19:00:15.872694 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="122a08ef-bda0-4996-9a9f-f85ee4194826" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.284156 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" containerName="oauth-openshift" containerID="cri-o://ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a" gracePeriod=15 Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.759510 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817317 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817673 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817796 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: 
\"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.817931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rjj\" (UniqueName: \"kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818001 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818067 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818128 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818190 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc 
kubenswrapper[5033]: I0319 19:00:16.818249 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818492 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.818630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819256 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819334 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819441 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819514 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca\") pod \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\" (UID: \"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c\") " Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.819970 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.820981 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.821023 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.820370 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.821053 5033 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.824914 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.825182 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.825737 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.826190 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.826220 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.826400 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.830606 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.831418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.833979 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj" (OuterVolumeSpecName: "kube-api-access-45rjj") pod "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" (UID: "a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c"). InnerVolumeSpecName "kube-api-access-45rjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.883936 5033 generic.go:334] "Generic (PLEG): container finished" podID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" containerID="ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a" exitCode=0 Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.883983 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" event={"ID":"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c","Type":"ContainerDied","Data":"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a"} Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.884014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" event={"ID":"a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c","Type":"ContainerDied","Data":"1d519a9aee51ba50ecf86e6c5de6f3628d46fdb1acc9205b26b115d3ddb34f43"} Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.884011 5033 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kd52d" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.884034 5033 scope.go:117] "RemoveContainer" containerID="ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.910546 5033 scope.go:117] "RemoveContainer" containerID="ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a" Mar 19 19:00:16 crc kubenswrapper[5033]: E0319 19:00:16.912277 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a\": container with ID starting with ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a not found: ID does not exist" containerID="ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.912361 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a"} err="failed to get container status \"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a\": rpc error: code = NotFound desc = could not find container \"ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a\": container with ID starting with ea907624d2bd11a1065a40c89853ed45d8cd27d54ae2ea32855d83521daa0b9a not found: ID does not exist" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922299 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922336 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rjj\" 
(UniqueName: \"kubernetes.io/projected/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-kube-api-access-45rjj\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922352 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922365 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922379 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922393 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922405 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922418 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922433 5033 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:16 crc kubenswrapper[5033]: I0319 19:00:16.922484 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:23 crc kubenswrapper[5033]: I0319 19:00:23.756367 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 19:00:23 crc kubenswrapper[5033]: I0319 19:00:23.982155 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[5033]: I0319 19:00:24.439883 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 19:00:24 crc kubenswrapper[5033]: I0319 19:00:24.622610 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[5033]: I0319 19:00:24.635540 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 19:00:24 crc kubenswrapper[5033]: I0319 19:00:24.934322 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 19:00:25 crc kubenswrapper[5033]: I0319 19:00:25.227688 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 19:00:25 crc kubenswrapper[5033]: I0319 19:00:25.335442 5033 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 19:00:25 crc kubenswrapper[5033]: I0319 19:00:25.697623 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 19:00:25 crc kubenswrapper[5033]: I0319 19:00:25.931064 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 19:00:25 crc kubenswrapper[5033]: I0319 19:00:25.977588 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.098647 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.167414 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.329321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.646131 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.669744 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.725026 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.921907 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 
19:00:26.948546 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.984852 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 19:00:26 crc kubenswrapper[5033]: I0319 19:00:26.997858 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.008004 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.053333 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.323811 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.415302 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.451773 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.585515 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.603312 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 19:00:27 crc 
kubenswrapper[5033]: I0319 19:00:27.633762 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.635002 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.752762 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.779703 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.837760 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.874743 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.898722 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 19:00:27 crc kubenswrapper[5033]: I0319 19:00:27.946335 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.168396 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.306912 5033 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.384656 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.442648 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.629654 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.678110 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.700857 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.764787 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.796878 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.837239 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.895211 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.901554 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.912048 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 
19:00:28.986804 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 19:00:28 crc kubenswrapper[5033]: I0319 19:00:28.988352 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.023872 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.050786 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.102436 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.163973 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.179916 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.180333 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.362927 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.376249 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.463149 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 19:00:29 crc 
kubenswrapper[5033]: I0319 19:00:29.487936 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.504002 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.620416 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.642852 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.786736 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.906884 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 19:00:29 crc kubenswrapper[5033]: I0319 19:00:29.978433 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.019103 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.115063 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.129092 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.135549 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 19:00:30 crc 
kubenswrapper[5033]: I0319 19:00:30.151533 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.248616 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.306751 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.362770 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.522716 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.523740 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.567112 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.637922 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.671656 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 19:00:30 crc kubenswrapper[5033]: I0319 19:00:30.839326 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.291742 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.315807 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.355812 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.444412 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.462890 5033 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.479433 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.510796 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.569910 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.605041 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.670698 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.769472 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.799851 5033 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.844239 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.878901 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.881380 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.911949 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.952269 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 19:00:31 crc kubenswrapper[5033]: I0319 19:00:31.997889 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.041876 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.103615 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.154197 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.161007 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 19:00:32 crc 
kubenswrapper[5033]: I0319 19:00:32.186737 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.308237 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.547668 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.604294 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.604517 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.670005 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.784981 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.808534 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 19:00:32 crc kubenswrapper[5033]: I0319 19:00:32.835825 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.156426 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.263931 5033 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.333667 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.341188 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.389685 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.490235 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.532190 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.534252 5033 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.534952 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.534920643 podStartE2EDuration="42.534920643s" podCreationTimestamp="2026-03-19 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:14.879762208 +0000 UTC m=+224.984792067" watchObservedRunningTime="2026-03-19 19:00:33.534920643 +0000 UTC m=+243.639950532" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.541876 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-kd52d"] Mar 19 19:00:33 crc 
kubenswrapper[5033]: I0319 19:00:33.541957 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc","openshift-infra/auto-csr-approver-29565780-g2zh9"] Mar 19 19:00:33 crc kubenswrapper[5033]: E0319 19:00:33.542239 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" containerName="oauth-openshift" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.542283 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" containerName="oauth-openshift" Mar 19 19:00:33 crc kubenswrapper[5033]: E0319 19:00:33.542317 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67290f23-7880-41c5-a90f-575154eb45da" containerName="installer" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.542334 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="67290f23-7880-41c5-a90f-575154eb45da" containerName="installer" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.542706 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" containerName="oauth-openshift" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.542739 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="67290f23-7880-41c5-a90f-575154eb45da" containerName="installer" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.543634 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.544061 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.547939 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.548534 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.548756 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.549032 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.549170 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.550677 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.551085 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.569554 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.569534119 podStartE2EDuration="19.569534119s" podCreationTimestamp="2026-03-19 19:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:33.567347768 +0000 UTC m=+243.672377657" watchObservedRunningTime="2026-03-19 19:00:33.569534119 +0000 UTC m=+243.674563998" Mar 19 19:00:33 crc 
kubenswrapper[5033]: I0319 19:00:33.622731 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.627128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.627296 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.627400 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2j24\" (UniqueName: \"kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24\") pod \"auto-csr-approver-29565780-g2zh9\" (UID: \"e0d7d36a-f1c2-4969-8368-75376b0c2197\") " pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.627511 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnbhx\" (UniqueName: \"kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.649484 5033 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.661626 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.672989 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.726213 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.727971 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2j24\" (UniqueName: \"kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24\") pod \"auto-csr-approver-29565780-g2zh9\" (UID: \"e0d7d36a-f1c2-4969-8368-75376b0c2197\") " pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.728038 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnbhx\" (UniqueName: \"kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.728083 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.728139 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.729299 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.735009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.740675 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.749156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2j24\" (UniqueName: \"kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24\") pod \"auto-csr-approver-29565780-g2zh9\" (UID: \"e0d7d36a-f1c2-4969-8368-75376b0c2197\") " pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.750852 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnbhx\" (UniqueName: 
\"kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx\") pod \"collect-profiles-29565780-p8glc\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.792178 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.809538 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.847826 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.851271 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.864156 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.877280 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.884811 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.885540 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 19:00:33 crc kubenswrapper[5033]: I0319 19:00:33.952530 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.025085 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.058196 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.088247 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.391113 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.422855 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.493405 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.532237 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7cf76d985f-llwdv"] Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.533021 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.537172 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.537367 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.538328 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.538873 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.539480 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.539595 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.544325 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.546166 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.548074 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.548101 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 19:00:34 
crc kubenswrapper[5033]: I0319 19:00:34.548534 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.548670 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.555944 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.560803 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.573168 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.594056 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.595365 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.637755 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c" path="/var/lib/kubelet/pods/a7c09583-0e89-4ab5-a9d9-4b9fa6b2941c/volumes" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644069 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " 
pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644142 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-error\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644200 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-session\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644253 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcmj\" (UniqueName: \"kubernetes.io/projected/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-kube-api-access-wwcmj\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644275 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-login\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.644871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-dir\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645003 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645045 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645085 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-policies\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.645243 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: 
\"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.656491 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.738038 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746419 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-error\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746467 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746493 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-session\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746524 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746553 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcmj\" (UniqueName: \"kubernetes.io/projected/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-kube-api-access-wwcmj\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746577 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-login\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-dir\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " 
pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746781 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-policies\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-dir\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.747876 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.746830 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.747937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-audit-policies\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.748490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 
19:00:34.749138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.749565 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.752777 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.753302 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-session\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.753702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.754741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-login\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.755093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.756443 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.756672 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.759234 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-v4-0-config-user-template-error\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.769676 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcmj\" (UniqueName: \"kubernetes.io/projected/ad1ed82f-3284-44f9-947c-dc7a5240a8e4-kube-api-access-wwcmj\") pod \"oauth-openshift-7cf76d985f-llwdv\" (UID: \"ad1ed82f-3284-44f9-947c-dc7a5240a8e4\") " pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.865186 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:34 crc kubenswrapper[5033]: I0319 19:00:34.916108 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.057589 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.106882 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.168213 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.172345 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: 
I0319 19:00:35.330867 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.399182 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.471199 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.500577 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.502060 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.526398 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.552692 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.617068 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.739367 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.769977 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.780387 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.875971 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.924725 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.946853 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 19:00:35 crc kubenswrapper[5033]: I0319 19:00:35.986577 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.178276 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.202243 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.234642 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.402279 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.561574 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.628096 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.632921 5033 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.677563 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.697594 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.722868 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.751359 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.772266 5033 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.784826 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.789131 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.844179 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.845443 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 19:00:36 crc kubenswrapper[5033]: I0319 19:00:36.901231 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.022014 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.147253 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.164470 5033 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.164722 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c" gracePeriod=5 Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.213099 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.280146 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.378949 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.619975 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.642927 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.700970 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.743499 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.754797 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.817656 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.851957 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.887986 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 19:00:37 crc kubenswrapper[5033]: I0319 19:00:37.996783 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-g2zh9"] Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.011865 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cf76d985f-llwdv"] Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.026713 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc"] Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.045882 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.052967 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.101659 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.231180 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.266957 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.275136 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.276519 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.365363 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.389904 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cf76d985f-llwdv"] Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.450838 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc"] Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.454522 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-g2zh9"] Mar 19 19:00:38 crc kubenswrapper[5033]: W0319 19:00:38.460267 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d8ddb8_2793_4307_9fbc_327e3ee978dd.slice/crio-f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8 WatchSource:0}: Error finding container 
f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8: Status 404 returned error can't find the container with id f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8 Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.468292 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 19:00:38 crc kubenswrapper[5033]: W0319 19:00:38.468493 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0d7d36a_f1c2_4969_8368_75376b0c2197.slice/crio-001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea WatchSource:0}: Error finding container 001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea: Status 404 returned error can't find the container with id 001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.569727 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.602039 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.628837 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.690603 5033 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.760800 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.868885 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 19:00:38 crc kubenswrapper[5033]: I0319 19:00:38.910707 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.047576 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" event={"ID":"ad1ed82f-3284-44f9-947c-dc7a5240a8e4","Type":"ContainerStarted","Data":"a1519677b80ba580557acbe411c00265c97e11131ea72418e3f5e1a7029b300d"} Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.047651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" event={"ID":"ad1ed82f-3284-44f9-947c-dc7a5240a8e4","Type":"ContainerStarted","Data":"b78023bd648105b565c85c10ed2ede2060d0163683d715976fd5b4b3440b6cd0"} Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.047883 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.049748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" event={"ID":"e0d7d36a-f1c2-4969-8368-75376b0c2197","Type":"ContainerStarted","Data":"001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea"} Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.051837 5033 generic.go:334] "Generic (PLEG): container finished" podID="63d8ddb8-2793-4307-9fbc-327e3ee978dd" containerID="a8f4fc1bf21b28fa40268c9b4514f791e54eb85644577dd8122ba75f8e9ff28e" exitCode=0 Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.051964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" 
event={"ID":"63d8ddb8-2793-4307-9fbc-327e3ee978dd","Type":"ContainerDied","Data":"a8f4fc1bf21b28fa40268c9b4514f791e54eb85644577dd8122ba75f8e9ff28e"} Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.051983 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" event={"ID":"63d8ddb8-2793-4307-9fbc-327e3ee978dd","Type":"ContainerStarted","Data":"f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8"} Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.062321 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.075597 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7cf76d985f-llwdv" podStartSLOduration=48.07558367 podStartE2EDuration="48.07558367s" podCreationTimestamp="2026-03-19 18:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:39.075424085 +0000 UTC m=+249.180453934" watchObservedRunningTime="2026-03-19 19:00:39.07558367 +0000 UTC m=+249.180613519" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.153003 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.289372 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.341890 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.401526 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.440476 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.480704 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.552854 5033 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.875392 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.895376 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 19:00:39 crc kubenswrapper[5033]: I0319 19:00:39.981011 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.061040 5033 generic.go:334] "Generic (PLEG): container finished" podID="e0d7d36a-f1c2-4969-8368-75376b0c2197" containerID="e9064855d760b0feef7575b1fc9157300ed4eeec6cd893746000319a93786762" exitCode=0 Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.061352 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" event={"ID":"e0d7d36a-f1c2-4969-8368-75376b0c2197","Type":"ContainerDied","Data":"e9064855d760b0feef7575b1fc9157300ed4eeec6cd893746000319a93786762"} Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.184316 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.251099 5033 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.308160 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.338499 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume\") pod \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.338565 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnbhx\" (UniqueName: \"kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx\") pod \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.338596 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume\") pod \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\" (UID: \"63d8ddb8-2793-4307-9fbc-327e3ee978dd\") " Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.339191 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "63d8ddb8-2793-4307-9fbc-327e3ee978dd" (UID: "63d8ddb8-2793-4307-9fbc-327e3ee978dd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.343776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx" (OuterVolumeSpecName: "kube-api-access-gnbhx") pod "63d8ddb8-2793-4307-9fbc-327e3ee978dd" (UID: "63d8ddb8-2793-4307-9fbc-327e3ee978dd"). InnerVolumeSpecName "kube-api-access-gnbhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.345309 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "63d8ddb8-2793-4307-9fbc-327e3ee978dd" (UID: "63d8ddb8-2793-4307-9fbc-327e3ee978dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.440566 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d8ddb8-2793-4307-9fbc-327e3ee978dd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.440605 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnbhx\" (UniqueName: \"kubernetes.io/projected/63d8ddb8-2793-4307-9fbc-327e3ee978dd-kube-api-access-gnbhx\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.440620 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63d8ddb8-2793-4307-9fbc-327e3ee978dd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.561718 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 19:00:40 crc 
kubenswrapper[5033]: I0319 19:00:40.586217 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.758188 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:00:40 crc kubenswrapper[5033]: I0319 19:00:40.758233 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.012431 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.066902 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.067343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc" event={"ID":"63d8ddb8-2793-4307-9fbc-327e3ee978dd","Type":"ContainerDied","Data":"f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8"} Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.067368 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0e576f09bd482cf440a642a8382c71eea86dcab4f31d51cc1804690dd5abfb8" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.374778 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.552824 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2j24\" (UniqueName: \"kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24\") pod \"e0d7d36a-f1c2-4969-8368-75376b0c2197\" (UID: \"e0d7d36a-f1c2-4969-8368-75376b0c2197\") " Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.557845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24" (OuterVolumeSpecName: "kube-api-access-g2j24") pod "e0d7d36a-f1c2-4969-8368-75376b0c2197" (UID: "e0d7d36a-f1c2-4969-8368-75376b0c2197"). InnerVolumeSpecName "kube-api-access-g2j24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:41 crc kubenswrapper[5033]: I0319 19:00:41.654160 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2j24\" (UniqueName: \"kubernetes.io/projected/e0d7d36a-f1c2-4969-8368-75376b0c2197-kube-api-access-g2j24\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.078535 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" event={"ID":"e0d7d36a-f1c2-4969-8368-75376b0c2197","Type":"ContainerDied","Data":"001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea"} Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.078580 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001af3c5cbb5cfe2f3a0e9de1c401970e78a47f2d1a44b33312769872ca8afea" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.078741 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-g2zh9" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.760313 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.760427 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.870844 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.870908 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.870929 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.870958 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.870989 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871019 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871155 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871125 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871287 5033 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871531 5033 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.871559 5033 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.879844 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.973291 5033 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:42 crc kubenswrapper[5033]: I0319 19:00:42.973846 5033 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.086085 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.086170 5033 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c" exitCode=137 Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.086228 5033 scope.go:117] "RemoveContainer" containerID="be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c" Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.086356 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.116427 5033 scope.go:117] "RemoveContainer" containerID="be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c" Mar 19 19:00:43 crc kubenswrapper[5033]: E0319 19:00:43.116995 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c\": container with ID starting with be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c not found: ID does not exist" containerID="be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c" Mar 19 19:00:43 crc kubenswrapper[5033]: I0319 19:00:43.117028 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c"} err="failed to get container status \"be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c\": rpc error: code = NotFound desc = could not find container \"be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c\": container with ID starting with be5e3da6a0c89f7766db52ad61c0a15eb6bcd04cc2fdf4f6a9c519a2d06ee17c not found: ID does not exist" Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 19:00:44.627755 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 19:00:44.630329 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 19:00:44.643712 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 
19:00:44.643764 5033 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="f9b7efb3-1fad-4ed8-b91a-910e261bfdd2" Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 19:00:44.646886 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 19:00:44 crc kubenswrapper[5033]: I0319 19:00:44.646905 5033 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="f9b7efb3-1fad-4ed8-b91a-910e261bfdd2" Mar 19 19:00:50 crc kubenswrapper[5033]: I0319 19:00:50.894179 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 19:00:55 crc kubenswrapper[5033]: I0319 19:00:55.941216 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 19:00:55 crc kubenswrapper[5033]: I0319 19:00:55.941683 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" podUID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" containerName="controller-manager" containerID="cri-o://881cf0f7e5e5d838ad2568e58d4bcf43e4294ef6e9ef006d6818985b0c7af22d" gracePeriod=30 Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.042182 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.042640 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" podUID="275044ae-24c9-4bc4-a531-e31bef0b9596" containerName="route-controller-manager" 
containerID="cri-o://d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e" gracePeriod=30 Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.158110 5033 generic.go:334] "Generic (PLEG): container finished" podID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" containerID="881cf0f7e5e5d838ad2568e58d4bcf43e4294ef6e9ef006d6818985b0c7af22d" exitCode=0 Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.158154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" event={"ID":"405c8840-9fff-49ea-9fda-0ea0f942a7f2","Type":"ContainerDied","Data":"881cf0f7e5e5d838ad2568e58d4bcf43e4294ef6e9ef006d6818985b0c7af22d"} Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.357537 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.365996 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgcm\" (UniqueName: \"kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm\") pod \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.366057 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert\") pod \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.366109 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles\") pod \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " Mar 19 19:00:56 
crc kubenswrapper[5033]: I0319 19:00:56.366130 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca\") pod \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.366165 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config\") pod \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\" (UID: \"405c8840-9fff-49ea-9fda-0ea0f942a7f2\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.367090 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config" (OuterVolumeSpecName: "config") pod "405c8840-9fff-49ea-9fda-0ea0f942a7f2" (UID: "405c8840-9fff-49ea-9fda-0ea0f942a7f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.368199 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "405c8840-9fff-49ea-9fda-0ea0f942a7f2" (UID: "405c8840-9fff-49ea-9fda-0ea0f942a7f2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.368189 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "405c8840-9fff-49ea-9fda-0ea0f942a7f2" (UID: "405c8840-9fff-49ea-9fda-0ea0f942a7f2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.380950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "405c8840-9fff-49ea-9fda-0ea0f942a7f2" (UID: "405c8840-9fff-49ea-9fda-0ea0f942a7f2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.383397 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm" (OuterVolumeSpecName: "kube-api-access-xfgcm") pod "405c8840-9fff-49ea-9fda-0ea0f942a7f2" (UID: "405c8840-9fff-49ea-9fda-0ea0f942a7f2"). InnerVolumeSpecName "kube-api-access-xfgcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.416604 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.467368 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config\") pod \"275044ae-24c9-4bc4-a531-e31bef0b9596\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.467542 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca\") pod \"275044ae-24c9-4bc4-a531-e31bef0b9596\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.468419 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca" (OuterVolumeSpecName: "client-ca") pod "275044ae-24c9-4bc4-a531-e31bef0b9596" (UID: "275044ae-24c9-4bc4-a531-e31bef0b9596"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.468470 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config" (OuterVolumeSpecName: "config") pod "275044ae-24c9-4bc4-a531-e31bef0b9596" (UID: "275044ae-24c9-4bc4-a531-e31bef0b9596"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.468604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwtk\" (UniqueName: \"kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk\") pod \"275044ae-24c9-4bc4-a531-e31bef0b9596\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.468699 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert\") pod \"275044ae-24c9-4bc4-a531-e31bef0b9596\" (UID: \"275044ae-24c9-4bc4-a531-e31bef0b9596\") " Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469551 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/405c8840-9fff-49ea-9fda-0ea0f942a7f2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469582 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469600 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/275044ae-24c9-4bc4-a531-e31bef0b9596-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469615 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469632 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469648 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/405c8840-9fff-49ea-9fda-0ea0f942a7f2-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.469664 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgcm\" (UniqueName: \"kubernetes.io/projected/405c8840-9fff-49ea-9fda-0ea0f942a7f2-kube-api-access-xfgcm\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.471002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk" (OuterVolumeSpecName: "kube-api-access-rqwtk") pod "275044ae-24c9-4bc4-a531-e31bef0b9596" (UID: "275044ae-24c9-4bc4-a531-e31bef0b9596"). InnerVolumeSpecName "kube-api-access-rqwtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.471656 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "275044ae-24c9-4bc4-a531-e31bef0b9596" (UID: "275044ae-24c9-4bc4-a531-e31bef0b9596"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.571140 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwtk\" (UniqueName: \"kubernetes.io/projected/275044ae-24c9-4bc4-a531-e31bef0b9596-kube-api-access-rqwtk\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.571183 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275044ae-24c9-4bc4-a531-e31bef0b9596-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:56 crc kubenswrapper[5033]: I0319 19:00:56.737008 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.169133 5033 generic.go:334] "Generic (PLEG): container finished" podID="275044ae-24c9-4bc4-a531-e31bef0b9596" containerID="d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e" exitCode=0 Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.169238 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.169347 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" event={"ID":"275044ae-24c9-4bc4-a531-e31bef0b9596","Type":"ContainerDied","Data":"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e"} Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.169900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4" event={"ID":"275044ae-24c9-4bc4-a531-e31bef0b9596","Type":"ContainerDied","Data":"665ad1f31ffb28a3ce999577f6844797db60123e49fa3788f9b17730b5f22db2"} Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.169955 5033 scope.go:117] "RemoveContainer" containerID="d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.173786 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" event={"ID":"405c8840-9fff-49ea-9fda-0ea0f942a7f2","Type":"ContainerDied","Data":"540b497bfbe670e41048028a9d371c9988314ef86cc59c6735ec75c99b9913cd"} Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.173842 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fcdd876cf-zj9fc" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.191767 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.195251 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bd85978b-kvvw4"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.204411 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.208141 5033 scope.go:117] "RemoveContainer" containerID="d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.208354 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fcdd876cf-zj9fc"] Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.208722 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e\": container with ID starting with d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e not found: ID does not exist" containerID="d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.208778 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e"} err="failed to get container status \"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e\": rpc error: code = NotFound desc = could not find container \"d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e\": container 
with ID starting with d904f427b24e340a78f1f83efff77d781f05f4da48df9e845c03153b49c8f52e not found: ID does not exist" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.208805 5033 scope.go:117] "RemoveContainer" containerID="881cf0f7e5e5d838ad2568e58d4bcf43e4294ef6e9ef006d6818985b0c7af22d" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546030 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.546367 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546386 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.546402 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d7d36a-f1c2-4969-8368-75376b0c2197" containerName="oc" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546410 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d7d36a-f1c2-4969-8368-75376b0c2197" containerName="oc" Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.546422 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d8ddb8-2793-4307-9fbc-327e3ee978dd" containerName="collect-profiles" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546437 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d8ddb8-2793-4307-9fbc-327e3ee978dd" containerName="collect-profiles" Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.546470 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275044ae-24c9-4bc4-a531-e31bef0b9596" containerName="route-controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546478 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="275044ae-24c9-4bc4-a531-e31bef0b9596" containerName="route-controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: E0319 19:00:57.546488 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" containerName="controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.546518 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" containerName="controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.548517 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" containerName="controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.548550 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.548566 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d8ddb8-2793-4307-9fbc-327e3ee978dd" containerName="collect-profiles" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.548573 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="275044ae-24c9-4bc4-a531-e31bef0b9596" containerName="route-controller-manager" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.548585 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d7d36a-f1c2-4969-8368-75376b0c2197" containerName="oc" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.550895 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.551532 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.551934 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.555193 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.558778 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.558953 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559426 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559569 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559441 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559761 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559843 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559911 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.559731 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.560264 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.560407 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.560347 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.564425 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.565834 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582266 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582288 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582401 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582590 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rx9n\" (UniqueName: \"kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582642 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4969\" (UniqueName: \"kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582707 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.582797 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683122 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rx9n\" (UniqueName: \"kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683172 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4969\" (UniqueName: \"kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: 
\"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683232 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683264 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683287 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683316 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683332 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.683350 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.684144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.684292 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.685059 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.685441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.686208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.687410 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.688747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" 
Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.699072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4969\" (UniqueName: \"kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969\") pod \"controller-manager-644b5d58fc-vrkz4\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.699578 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rx9n\" (UniqueName: \"kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n\") pod \"route-controller-manager-7cb6456979-57lbl\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.870014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:57 crc kubenswrapper[5033]: I0319 19:00:57.878964 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:00:58 crc kubenswrapper[5033]: I0319 19:00:58.129642 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:00:58 crc kubenswrapper[5033]: I0319 19:00:58.183112 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" event={"ID":"460922de-e94a-4688-87dd-a7ea64a276dc","Type":"ContainerStarted","Data":"5d5b744b80c599bfff8a85221d39979e58d8ef734549ac98c20b6b9293b39899"} Mar 19 19:00:58 crc kubenswrapper[5033]: I0319 19:00:58.278177 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:00:58 crc kubenswrapper[5033]: I0319 19:00:58.627563 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275044ae-24c9-4bc4-a531-e31bef0b9596" path="/var/lib/kubelet/pods/275044ae-24c9-4bc4-a531-e31bef0b9596/volumes" Mar 19 19:00:58 crc kubenswrapper[5033]: I0319 19:00:58.628523 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405c8840-9fff-49ea-9fda-0ea0f942a7f2" path="/var/lib/kubelet/pods/405c8840-9fff-49ea-9fda-0ea0f942a7f2/volumes" Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.189306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" event={"ID":"460922de-e94a-4688-87dd-a7ea64a276dc","Type":"ContainerStarted","Data":"dcfc0ca375fd570d0cbeb28856e70f66cdf95f9ad607e2dc59aae542e0c3d299"} Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.189527 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.192286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" event={"ID":"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b","Type":"ContainerStarted","Data":"9f38ae090fc212baf8bd0b67a1d413860d5bdf364d408b000bb8297226226670"} Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.192323 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" event={"ID":"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b","Type":"ContainerStarted","Data":"61d15b5db073544b58c4a25cc4c6d1568121340074bac97ede1e7a10ee964482"} Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.194066 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.208223 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" podStartSLOduration=4.208206824 podStartE2EDuration="4.208206824s" podCreationTimestamp="2026-03-19 19:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:59.206807798 +0000 UTC m=+269.311837657" watchObservedRunningTime="2026-03-19 19:00:59.208206824 +0000 UTC m=+269.313236673" Mar 19 19:00:59 crc kubenswrapper[5033]: I0319 19:00:59.223244 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" podStartSLOduration=3.223212238 podStartE2EDuration="3.223212238s" podCreationTimestamp="2026-03-19 19:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:59.221813672 +0000 UTC m=+269.326843531" watchObservedRunningTime="2026-03-19 19:00:59.223212238 +0000 UTC 
m=+269.328242147" Mar 19 19:01:00 crc kubenswrapper[5033]: I0319 19:01:00.171729 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 19:01:00 crc kubenswrapper[5033]: I0319 19:01:00.196727 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:01:00 crc kubenswrapper[5033]: I0319 19:01:00.201980 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:01:05 crc kubenswrapper[5033]: I0319 19:01:05.225083 5033 generic.go:334] "Generic (PLEG): container finished" podID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerID="c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25" exitCode=0 Mar 19 19:01:05 crc kubenswrapper[5033]: I0319 19:01:05.225162 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerDied","Data":"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25"} Mar 19 19:01:05 crc kubenswrapper[5033]: I0319 19:01:05.225802 5033 scope.go:117] "RemoveContainer" containerID="c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25" Mar 19 19:01:05 crc kubenswrapper[5033]: I0319 19:01:05.383316 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 19:01:06 crc kubenswrapper[5033]: I0319 19:01:06.033628 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 19:01:06 crc kubenswrapper[5033]: I0319 19:01:06.233301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" 
event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerStarted","Data":"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca"} Mar 19 19:01:06 crc kubenswrapper[5033]: I0319 19:01:06.233959 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 19:01:06 crc kubenswrapper[5033]: I0319 19:01:06.236873 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 19:01:07 crc kubenswrapper[5033]: I0319 19:01:07.098476 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 19:01:09 crc kubenswrapper[5033]: I0319 19:01:09.744782 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.142955 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.758223 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.758270 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.758307 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.758803 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:01:10 crc kubenswrapper[5033]: I0319 19:01:10.758849 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208" gracePeriod=600 Mar 19 19:01:11 crc kubenswrapper[5033]: I0319 19:01:11.054025 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 19:01:11 crc kubenswrapper[5033]: I0319 19:01:11.267263 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208" exitCode=0 Mar 19 19:01:11 crc kubenswrapper[5033]: I0319 19:01:11.267325 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208"} Mar 19 19:01:11 crc kubenswrapper[5033]: I0319 19:01:11.267369 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3"} Mar 19 19:01:12 crc kubenswrapper[5033]: I0319 19:01:12.199635 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 19:01:12 crc kubenswrapper[5033]: I0319 19:01:12.819793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 19:01:14 crc kubenswrapper[5033]: I0319 19:01:14.531715 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 19:01:15 crc kubenswrapper[5033]: I0319 19:01:15.968069 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:01:15 crc kubenswrapper[5033]: I0319 19:01:15.968757 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" podUID="460922de-e94a-4688-87dd-a7ea64a276dc" containerName="controller-manager" containerID="cri-o://dcfc0ca375fd570d0cbeb28856e70f66cdf95f9ad607e2dc59aae542e0c3d299" gracePeriod=30 Mar 19 19:01:15 crc kubenswrapper[5033]: I0319 19:01:15.976858 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:01:15 crc kubenswrapper[5033]: I0319 19:01:15.977107 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" podUID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" containerName="route-controller-manager" containerID="cri-o://9f38ae090fc212baf8bd0b67a1d413860d5bdf364d408b000bb8297226226670" gracePeriod=30 Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.301025 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="460922de-e94a-4688-87dd-a7ea64a276dc" containerID="dcfc0ca375fd570d0cbeb28856e70f66cdf95f9ad607e2dc59aae542e0c3d299" exitCode=0 Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.301212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" event={"ID":"460922de-e94a-4688-87dd-a7ea64a276dc","Type":"ContainerDied","Data":"dcfc0ca375fd570d0cbeb28856e70f66cdf95f9ad607e2dc59aae542e0c3d299"} Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.303103 5033 generic.go:334] "Generic (PLEG): container finished" podID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" containerID="9f38ae090fc212baf8bd0b67a1d413860d5bdf364d408b000bb8297226226670" exitCode=0 Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.303154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" event={"ID":"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b","Type":"ContainerDied","Data":"9f38ae090fc212baf8bd0b67a1d413860d5bdf364d408b000bb8297226226670"} Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.544910 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.557018 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca\") pod \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.557063 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config\") pod \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.557100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert\") pod \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.557128 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rx9n\" (UniqueName: \"kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n\") pod \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\" (UID: \"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.558132 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" (UID: "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.558381 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config" (OuterVolumeSpecName: "config") pod "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" (UID: "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.570715 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n" (OuterVolumeSpecName: "kube-api-access-4rx9n") pod "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" (UID: "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b"). InnerVolumeSpecName "kube-api-access-4rx9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.572256 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" (UID: "a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.629286 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658157 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert\") pod \"460922de-e94a-4688-87dd-a7ea64a276dc\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658205 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config\") pod \"460922de-e94a-4688-87dd-a7ea64a276dc\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658262 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca\") pod \"460922de-e94a-4688-87dd-a7ea64a276dc\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658295 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles\") pod \"460922de-e94a-4688-87dd-a7ea64a276dc\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658343 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4969\" (UniqueName: \"kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969\") pod \"460922de-e94a-4688-87dd-a7ea64a276dc\" (UID: \"460922de-e94a-4688-87dd-a7ea64a276dc\") " Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658682 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658700 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658710 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.658721 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rx9n\" (UniqueName: \"kubernetes.io/projected/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b-kube-api-access-4rx9n\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.660531 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "460922de-e94a-4688-87dd-a7ea64a276dc" (UID: "460922de-e94a-4688-87dd-a7ea64a276dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.661134 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "460922de-e94a-4688-87dd-a7ea64a276dc" (UID: "460922de-e94a-4688-87dd-a7ea64a276dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.661189 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config" (OuterVolumeSpecName: "config") pod "460922de-e94a-4688-87dd-a7ea64a276dc" (UID: "460922de-e94a-4688-87dd-a7ea64a276dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.662639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969" (OuterVolumeSpecName: "kube-api-access-z4969") pod "460922de-e94a-4688-87dd-a7ea64a276dc" (UID: "460922de-e94a-4688-87dd-a7ea64a276dc"). InnerVolumeSpecName "kube-api-access-z4969". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.665060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "460922de-e94a-4688-87dd-a7ea64a276dc" (UID: "460922de-e94a-4688-87dd-a7ea64a276dc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.759367 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.759405 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.759418 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4969\" (UniqueName: \"kubernetes.io/projected/460922de-e94a-4688-87dd-a7ea64a276dc-kube-api-access-z4969\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.759427 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460922de-e94a-4688-87dd-a7ea64a276dc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:16 crc kubenswrapper[5033]: I0319 19:01:16.759436 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460922de-e94a-4688-87dd-a7ea64a276dc-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.310648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" event={"ID":"a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b","Type":"ContainerDied","Data":"61d15b5db073544b58c4a25cc4c6d1568121340074bac97ede1e7a10ee964482"} Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.310938 5033 scope.go:117] "RemoveContainer" containerID="9f38ae090fc212baf8bd0b67a1d413860d5bdf364d408b000bb8297226226670" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.310676 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.315706 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" event={"ID":"460922de-e94a-4688-87dd-a7ea64a276dc","Type":"ContainerDied","Data":"5d5b744b80c599bfff8a85221d39979e58d8ef734549ac98c20b6b9293b39899"} Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.315735 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-vrkz4" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.339126 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.343066 5033 scope.go:117] "RemoveContainer" containerID="dcfc0ca375fd570d0cbeb28856e70f66cdf95f9ad607e2dc59aae542e0c3d299" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.344140 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-57lbl"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.355435 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.359342 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-vrkz4"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.563096 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"] Mar 19 19:01:17 crc kubenswrapper[5033]: E0319 19:01:17.563846 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" containerName="route-controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.563881 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" containerName="route-controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: E0319 19:01:17.563906 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460922de-e94a-4688-87dd-a7ea64a276dc" containerName="controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.563921 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="460922de-e94a-4688-87dd-a7ea64a276dc" containerName="controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.564287 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" containerName="route-controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.564332 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="460922de-e94a-4688-87dd-a7ea64a276dc" containerName="controller-manager" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.566940 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.572294 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.573980 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.575730 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.592755 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.592778 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593038 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593303 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593464 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593557 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593652 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593699 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593747 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:01:17 crc 
kubenswrapper[5033]: I0319 19:01:17.593702 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.593890 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.594676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.599575 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"] Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.614079 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668004 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668089 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668390 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqnk\" (UniqueName: \"kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668781 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668820 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.668974 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.669047 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.669135 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thpm\" (UniqueName: \"kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771111 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771183 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqnk\" (UniqueName: \"kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771219 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771246 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771308 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771349 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thpm\" (UniqueName: \"kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc 
kubenswrapper[5033]: I0319 19:01:17.771403 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.771475 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.773184 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.773308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.774189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles\") pod \"controller-manager-746f59449f-zgznt\" (UID: 
\"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.775500 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.775579 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.776352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.780035 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.790279 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thpm\" (UniqueName: 
\"kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm\") pod \"route-controller-manager-dbb5d498c-7kgzx\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.800335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqnk\" (UniqueName: \"kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk\") pod \"controller-manager-746f59449f-zgznt\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") " pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.937618 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:17 crc kubenswrapper[5033]: I0319 19:01:17.945582 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:18 crc kubenswrapper[5033]: I0319 19:01:18.387345 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"] Mar 19 19:01:18 crc kubenswrapper[5033]: W0319 19:01:18.396785 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e4af7ce_f96a_4b4b_a0aa_48c10c42a27e.slice/crio-141c480d29a0c744c023337fa563d739130909aab052f328d2ac6ad385ebb665 WatchSource:0}: Error finding container 141c480d29a0c744c023337fa563d739130909aab052f328d2ac6ad385ebb665: Status 404 returned error can't find the container with id 141c480d29a0c744c023337fa563d739130909aab052f328d2ac6ad385ebb665 Mar 19 19:01:18 crc kubenswrapper[5033]: I0319 19:01:18.429585 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"] Mar 19 19:01:18 crc kubenswrapper[5033]: W0319 19:01:18.435068 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf208ab47_0aa9_4386_bd40_098dc2930be6.slice/crio-ed7cb6c31be9a6ef62b463cc9c2ceddff5dc05fdf0eabf8a478203e9e2dfb08a WatchSource:0}: Error finding container ed7cb6c31be9a6ef62b463cc9c2ceddff5dc05fdf0eabf8a478203e9e2dfb08a: Status 404 returned error can't find the container with id ed7cb6c31be9a6ef62b463cc9c2ceddff5dc05fdf0eabf8a478203e9e2dfb08a Mar 19 19:01:18 crc kubenswrapper[5033]: I0319 19:01:18.632243 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460922de-e94a-4688-87dd-a7ea64a276dc" path="/var/lib/kubelet/pods/460922de-e94a-4688-87dd-a7ea64a276dc/volumes" Mar 19 19:01:18 crc kubenswrapper[5033]: I0319 19:01:18.633219 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b" path="/var/lib/kubelet/pods/a3bb0d0f-5bd7-4a1f-9a1e-afdc33b47f5b/volumes" Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.329349 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" event={"ID":"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e","Type":"ContainerStarted","Data":"61230a468f8899b7ddfc41de010eb9ad3e7880f3d582ec3f584d64813e1bfa55"} Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.329627 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.329639 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" event={"ID":"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e","Type":"ContainerStarted","Data":"141c480d29a0c744c023337fa563d739130909aab052f328d2ac6ad385ebb665"} Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.330683 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" event={"ID":"f208ab47-0aa9-4386-bd40-098dc2930be6","Type":"ContainerStarted","Data":"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944"} Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.330730 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" event={"ID":"f208ab47-0aa9-4386-bd40-098dc2930be6","Type":"ContainerStarted","Data":"ed7cb6c31be9a6ef62b463cc9c2ceddff5dc05fdf0eabf8a478203e9e2dfb08a"} Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.331071 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.333496 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt"
Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.334436 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"
Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.347020 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" podStartSLOduration=4.346964902 podStartE2EDuration="4.346964902s" podCreationTimestamp="2026-03-19 19:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:19.34648354 +0000 UTC m=+289.451513399" watchObservedRunningTime="2026-03-19 19:01:19.346964902 +0000 UTC m=+289.451994771"
Mar 19 19:01:19 crc kubenswrapper[5033]: I0319 19:01:19.362248 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" podStartSLOduration=3.362229513 podStartE2EDuration="3.362229513s" podCreationTimestamp="2026-03-19 19:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:19.360134559 +0000 UTC m=+289.465164408" watchObservedRunningTime="2026-03-19 19:01:19.362229513 +0000 UTC m=+289.467259382"
Mar 19 19:01:56 crc kubenswrapper[5033]: I0319 19:01:56.428198 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"]
Mar 19 19:01:56 crc kubenswrapper[5033]: I0319 19:01:56.429044 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" podUID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" containerName="controller-manager" containerID="cri-o://61230a468f8899b7ddfc41de010eb9ad3e7880f3d582ec3f584d64813e1bfa55" gracePeriod=30
Mar 19 19:01:56 crc kubenswrapper[5033]: I0319 19:01:56.554437 5033 generic.go:334] "Generic (PLEG): container finished" podID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" containerID="61230a468f8899b7ddfc41de010eb9ad3e7880f3d582ec3f584d64813e1bfa55" exitCode=0
Mar 19 19:01:56 crc kubenswrapper[5033]: I0319 19:01:56.554502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" event={"ID":"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e","Type":"ContainerDied","Data":"61230a468f8899b7ddfc41de010eb9ad3e7880f3d582ec3f584d64813e1bfa55"}
Mar 19 19:01:56 crc kubenswrapper[5033]: I0319 19:01:56.895185 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.041540 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert\") pod \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") "
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.041572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config\") pod \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") "
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.041606 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqnk\" (UniqueName: \"kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk\") pod \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") "
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.041656 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca\") pod \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") "
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.041715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles\") pod \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\" (UID: \"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e\") "
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.042433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" (UID: "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.042841 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" (UID: "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.042891 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config" (OuterVolumeSpecName: "config") pod "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" (UID: "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.048049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk" (OuterVolumeSpecName: "kube-api-access-vzqnk") pod "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" (UID: "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e"). InnerVolumeSpecName "kube-api-access-vzqnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.057429 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" (UID: "0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.143554 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.143636 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.143670 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.143694 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-config\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.143720 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqnk\" (UniqueName: \"kubernetes.io/projected/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e-kube-api-access-vzqnk\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.560407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt" event={"ID":"0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e","Type":"ContainerDied","Data":"141c480d29a0c744c023337fa563d739130909aab052f328d2ac6ad385ebb665"}
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.560480 5033 scope.go:117] "RemoveContainer" containerID="61230a468f8899b7ddfc41de010eb9ad3e7880f3d582ec3f584d64813e1bfa55"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.560585 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-746f59449f-zgznt"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.582128 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"]
Mar 19 19:01:57 crc kubenswrapper[5033]: E0319 19:01:57.582330 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" containerName="controller-manager"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.582345 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" containerName="controller-manager"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.595630 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" containerName="controller-manager"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.596818 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.600511 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.600797 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.608267 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.608580 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.608744 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.608920 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.611811 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.612719 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"]
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.623351 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"]
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.626658 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-746f59449f-zgznt"]
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.750157 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-serving-cert\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.750231 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.750270 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-client-ca\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.750306 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstkt\" (UniqueName: \"kubernetes.io/projected/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-kube-api-access-lstkt\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.750431 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-config\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.851591 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.851660 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-client-ca\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.851715 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstkt\" (UniqueName: \"kubernetes.io/projected/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-kube-api-access-lstkt\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.851774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-config\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.851882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-serving-cert\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.852691 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-proxy-ca-bundles\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.853483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-config\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.854226 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-client-ca\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.858224 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-serving-cert\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.872285 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstkt\" (UniqueName: \"kubernetes.io/projected/2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7-kube-api-access-lstkt\") pod \"controller-manager-644b5d58fc-4d5gr\" (UID: \"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7\") " pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:57 crc kubenswrapper[5033]: I0319 19:01:57.923538 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.300737 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"]
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.566313 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr" event={"ID":"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7","Type":"ContainerStarted","Data":"c3c0aa3208a389b3104c6746f919707213ae9da1cc76f1a4d51a3bdbed36a590"}
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.566724 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.566741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr" event={"ID":"2c8bff2b-1df0-44f5-bdf5-3ae0dd8086f7","Type":"ContainerStarted","Data":"e43cdafb99a14d9f9c057db8c757177d1c5c941c2dbfb81a23f496510edd218f"}
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.576345 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr"
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.596542 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-644b5d58fc-4d5gr" podStartSLOduration=2.596522273 podStartE2EDuration="2.596522273s" podCreationTimestamp="2026-03-19 19:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:58.593255598 +0000 UTC m=+328.698285477" watchObservedRunningTime="2026-03-19 19:01:58.596522273 +0000 UTC m=+328.701552112"
Mar 19 19:01:58 crc kubenswrapper[5033]: I0319 19:01:58.627617 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e" path="/var/lib/kubelet/pods/0e4af7ce-f96a-4b4b-a0aa-48c10c42a27e/volumes"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.124951 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565782-s7btv"]
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.125948 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.128402 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.129747 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.132634 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.134329 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-s7btv"]
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.282535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcv6\" (UniqueName: \"kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6\") pod \"auto-csr-approver-29565782-s7btv\" (UID: \"8f376a12-8710-4832-9d22-014c54f33dfd\") " pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.385003 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcv6\" (UniqueName: \"kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6\") pod \"auto-csr-approver-29565782-s7btv\" (UID: \"8f376a12-8710-4832-9d22-014c54f33dfd\") " pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.407722 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcv6\" (UniqueName: \"kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6\") pod \"auto-csr-approver-29565782-s7btv\" (UID: \"8f376a12-8710-4832-9d22-014c54f33dfd\") " pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.440793 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:00 crc kubenswrapper[5033]: I0319 19:02:00.875914 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-s7btv"]
Mar 19 19:02:01 crc kubenswrapper[5033]: I0319 19:02:01.590480 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-s7btv" event={"ID":"8f376a12-8710-4832-9d22-014c54f33dfd","Type":"ContainerStarted","Data":"667fb87eea96721f387906f6a15e622e2cbc736b6c461c6c594e90a415b229e6"}
Mar 19 19:02:04 crc kubenswrapper[5033]: I0319 19:02:04.614664 5033 generic.go:334] "Generic (PLEG): container finished" podID="8f376a12-8710-4832-9d22-014c54f33dfd" containerID="9b24b8d1b2d10a3ef83856d75f1db6cb89c45753c951b265f963db706c337903" exitCode=0
Mar 19 19:02:04 crc kubenswrapper[5033]: I0319 19:02:04.614740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-s7btv" event={"ID":"8f376a12-8710-4832-9d22-014c54f33dfd","Type":"ContainerDied","Data":"9b24b8d1b2d10a3ef83856d75f1db6cb89c45753c951b265f963db706c337903"}
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.040349 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.157726 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhcv6\" (UniqueName: \"kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6\") pod \"8f376a12-8710-4832-9d22-014c54f33dfd\" (UID: \"8f376a12-8710-4832-9d22-014c54f33dfd\") "
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.164769 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6" (OuterVolumeSpecName: "kube-api-access-nhcv6") pod "8f376a12-8710-4832-9d22-014c54f33dfd" (UID: "8f376a12-8710-4832-9d22-014c54f33dfd"). InnerVolumeSpecName "kube-api-access-nhcv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.259562 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhcv6\" (UniqueName: \"kubernetes.io/projected/8f376a12-8710-4832-9d22-014c54f33dfd-kube-api-access-nhcv6\") on node \"crc\" DevicePath \"\""
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.635618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-s7btv" event={"ID":"8f376a12-8710-4832-9d22-014c54f33dfd","Type":"ContainerDied","Data":"667fb87eea96721f387906f6a15e622e2cbc736b6c461c6c594e90a415b229e6"}
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.635685 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667fb87eea96721f387906f6a15e622e2cbc736b6c461c6c594e90a415b229e6"
Mar 19 19:02:06 crc kubenswrapper[5033]: I0319 19:02:06.635712 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-s7btv"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.670155 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2sckc"]
Mar 19 19:02:15 crc kubenswrapper[5033]: E0319 19:02:15.670840 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f376a12-8710-4832-9d22-014c54f33dfd" containerName="oc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.670854 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f376a12-8710-4832-9d22-014c54f33dfd" containerName="oc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.670973 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f376a12-8710-4832-9d22-014c54f33dfd" containerName="oc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.671324 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.694208 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2sckc"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.746274 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.746490 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hhm4g" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="registry-server" containerID="cri-o://f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.751523 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2zhl"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.751746 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2zhl" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="registry-server" containerID="cri-o://6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.763014 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.763215 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" containerID="cri-o://c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.780283 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-certificates\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.780362 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a8ebf9-7711-48a2-a0bd-eb497fd74770-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.780509 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.780745 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781022 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dvj7" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="registry-server" containerID="cri-o://a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781352 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghxh\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-kube-api-access-mghxh\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-trusted-ca\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781419 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-tls\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781444 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a8ebf9-7711-48a2-a0bd-eb497fd74770-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.781493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-bound-sa-token\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.793240 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpkw6"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.794431 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.801524 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.801761 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ddsbp" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="registry-server" containerID="cri-o://abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.806442 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpkw6"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.827638 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.882991 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghxh\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-kube-api-access-mghxh\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883270 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-trusted-ca\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-tls\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883313 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a8ebf9-7711-48a2-a0bd-eb497fd74770-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-bound-sa-token\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883373 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-certificates\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883394 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a8ebf9-7711-48a2-a0bd-eb497fd74770-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.883771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a8ebf9-7711-48a2-a0bd-eb497fd74770-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.885340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-trusted-ca\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.885362 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-certificates\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.899373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a8ebf9-7711-48a2-a0bd-eb497fd74770-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.900305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-registry-tls\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.906564 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-bound-sa-token\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.908165 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghxh\" (UniqueName: \"kubernetes.io/projected/42a8ebf9-7711-48a2-a0bd-eb497fd74770-kube-api-access-mghxh\") pod \"image-registry-66df7c8f76-2sckc\" (UID: \"42a8ebf9-7711-48a2-a0bd-eb497fd74770\") " pod="openshift-image-registry/image-registry-66df7c8f76-2sckc"
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.941878 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"]
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.942080 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" podUID="f208ab47-0aa9-4386-bd40-098dc2930be6" containerName="route-controller-manager" containerID="cri-o://1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944" gracePeriod=30
Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.985717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rp4\" (UniqueName:
\"kubernetes.io/projected/3a73e1d1-a896-4aa7-bba9-c372ab716534-kube-api-access-48rp4\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.985773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.985795 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:15 crc kubenswrapper[5033]: I0319 19:02:15.995926 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.087997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rp4\" (UniqueName: \"kubernetes.io/projected/3a73e1d1-a896-4aa7-bba9-c372ab716534-kube-api-access-48rp4\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.088230 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.088250 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.089700 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.100815 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/3a73e1d1-a896-4aa7-bba9-c372ab716534-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.107099 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rp4\" (UniqueName: \"kubernetes.io/projected/3a73e1d1-a896-4aa7-bba9-c372ab716534-kube-api-access-48rp4\") pod \"marketplace-operator-79b997595-xpkw6\" (UID: \"3a73e1d1-a896-4aa7-bba9-c372ab716534\") " pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.243165 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.246223 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.390882 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnfqz\" (UniqueName: \"kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz\") pod \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.390993 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content\") pod \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.391064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities\") pod \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\" (UID: \"ab53fd0b-8294-4dd0-a434-ab7eaff0e360\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.392596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities" (OuterVolumeSpecName: "utilities") pod "ab53fd0b-8294-4dd0-a434-ab7eaff0e360" (UID: "ab53fd0b-8294-4dd0-a434-ab7eaff0e360"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.394363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz" (OuterVolumeSpecName: "kube-api-access-jnfqz") pod "ab53fd0b-8294-4dd0-a434-ab7eaff0e360" (UID: "ab53fd0b-8294-4dd0-a434-ab7eaff0e360"). InnerVolumeSpecName "kube-api-access-jnfqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.470125 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab53fd0b-8294-4dd0-a434-ab7eaff0e360" (UID: "ab53fd0b-8294-4dd0-a434-ab7eaff0e360"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.493701 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.493737 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnfqz\" (UniqueName: \"kubernetes.io/projected/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-kube-api-access-jnfqz\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.493750 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab53fd0b-8294-4dd0-a434-ab7eaff0e360-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.521208 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.528369 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.584936 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.609628 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.617481 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.688875 5033 generic.go:334] "Generic (PLEG): container finished" podID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerID="a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.688943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerDied","Data":"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.688960 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dvj7" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.688989 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dvj7" event={"ID":"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6","Type":"ContainerDied","Data":"0a19cef964dbd07c8a4efe5430e10a09df493856959374e7b270972475ef49c3"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.689008 5033 scope.go:117] "RemoveContainer" containerID="a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.691263 5033 generic.go:334] "Generic (PLEG): container finished" podID="63e3da86-ceaf-47ef-81af-07853efd035b" containerID="abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.691306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerDied","Data":"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.691333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddsbp" event={"ID":"63e3da86-ceaf-47ef-81af-07853efd035b","Type":"ContainerDied","Data":"77a58f0bf88a6a8bb03229679d5a3f475d35eea59dac432fd9ec639f6fa9ec3c"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.691415 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ddsbp" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.695389 5033 generic.go:334] "Generic (PLEG): container finished" podID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerID="6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.695482 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2zhl" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.695504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerDied","Data":"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.695540 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2zhl" event={"ID":"c8ee9ca5-b342-4011-8b36-d2d00314d515","Type":"ContainerDied","Data":"8dbf8fa3af2616e8a86b072d81a8cba2c773bb8524a0c88854a21fd504612496"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert\") pod \"f208ab47-0aa9-4386-bd40-098dc2930be6\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696532 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content\") pod \"c8ee9ca5-b342-4011-8b36-d2d00314d515\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696632 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9ps22\" (UniqueName: \"kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22\") pod \"c8ee9ca5-b342-4011-8b36-d2d00314d515\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696760 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config\") pod \"f208ab47-0aa9-4386-bd40-098dc2930be6\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696856 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities\") pod \"63e3da86-ceaf-47ef-81af-07853efd035b\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.696950 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqx4m\" (UniqueName: \"kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m\") pod \"63e3da86-ceaf-47ef-81af-07853efd035b\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.697045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content\") pod \"63e3da86-ceaf-47ef-81af-07853efd035b\" (UID: \"63e3da86-ceaf-47ef-81af-07853efd035b\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.697123 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5thpm\" (UniqueName: \"kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm\") pod \"f208ab47-0aa9-4386-bd40-098dc2930be6\" 
(UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.697210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca\") pod \"f208ab47-0aa9-4386-bd40-098dc2930be6\" (UID: \"f208ab47-0aa9-4386-bd40-098dc2930be6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.697285 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities\") pod \"c8ee9ca5-b342-4011-8b36-d2d00314d515\" (UID: \"c8ee9ca5-b342-4011-8b36-d2d00314d515\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.698273 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities" (OuterVolumeSpecName: "utilities") pod "63e3da86-ceaf-47ef-81af-07853efd035b" (UID: "63e3da86-ceaf-47ef-81af-07853efd035b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.698287 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities" (OuterVolumeSpecName: "utilities") pod "c8ee9ca5-b342-4011-8b36-d2d00314d515" (UID: "c8ee9ca5-b342-4011-8b36-d2d00314d515"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.698751 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config" (OuterVolumeSpecName: "config") pod "f208ab47-0aa9-4386-bd40-098dc2930be6" (UID: "f208ab47-0aa9-4386-bd40-098dc2930be6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.698900 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f208ab47-0aa9-4386-bd40-098dc2930be6" (UID: "f208ab47-0aa9-4386-bd40-098dc2930be6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.702592 5033 generic.go:334] "Generic (PLEG): container finished" podID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerID="c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.702733 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.703201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerDied","Data":"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.703255 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqvsx" event={"ID":"ce5d4445-04fd-4c00-8b1e-393386cc78ad","Type":"ContainerDied","Data":"91430cedea52a863238a7b873da6ca75cf930deee7791d012b3c6df9956382a2"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.710240 5033 generic.go:334] "Generic (PLEG): container finished" podID="f208ab47-0aa9-4386-bd40-098dc2930be6" containerID="1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.710348 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.710372 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" event={"ID":"f208ab47-0aa9-4386-bd40-098dc2930be6","Type":"ContainerDied","Data":"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.710408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx" event={"ID":"f208ab47-0aa9-4386-bd40-098dc2930be6","Type":"ContainerDied","Data":"ed7cb6c31be9a6ef62b463cc9c2ceddff5dc05fdf0eabf8a478203e9e2dfb08a"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716057 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m" (OuterVolumeSpecName: "kube-api-access-wqx4m") pod "63e3da86-ceaf-47ef-81af-07853efd035b" (UID: "63e3da86-ceaf-47ef-81af-07853efd035b"). InnerVolumeSpecName "kube-api-access-wqx4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716395 5033 scope.go:117] "RemoveContainer" containerID="3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716504 5033 generic.go:334] "Generic (PLEG): container finished" podID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerID="f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2" exitCode=0 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716543 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerDied","Data":"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhm4g" event={"ID":"ab53fd0b-8294-4dd0-a434-ab7eaff0e360","Type":"ContainerDied","Data":"184926c058485d1aeafbf8355983ef4916ea6fe35b71d55cf278023ab36b1c46"} Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.716569 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hhm4g" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.735970 5033 scope.go:117] "RemoveContainer" containerID="64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.736287 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"] Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.737426 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm" (OuterVolumeSpecName: "kube-api-access-5thpm") pod "f208ab47-0aa9-4386-bd40-098dc2930be6" (UID: "f208ab47-0aa9-4386-bd40-098dc2930be6"). InnerVolumeSpecName "kube-api-access-5thpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.737480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f208ab47-0aa9-4386-bd40-098dc2930be6" (UID: "f208ab47-0aa9-4386-bd40-098dc2930be6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.738525 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22" (OuterVolumeSpecName: "kube-api-access-9ps22") pod "c8ee9ca5-b342-4011-8b36-d2d00314d515" (UID: "c8ee9ca5-b342-4011-8b36-d2d00314d515"). InnerVolumeSpecName "kube-api-access-9ps22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.742592 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hhm4g"] Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.750626 5033 scope.go:117] "RemoveContainer" containerID="a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.751208 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e\": container with ID starting with a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e not found: ID does not exist" containerID="a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.751273 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e"} err="failed to get container status \"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e\": rpc error: code = NotFound desc = could not find container \"a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e\": container with ID starting with a2ad7493f98c831e20c73654278050998d70f227edcdab0fbf2987ea83defc8e not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.751325 5033 scope.go:117] "RemoveContainer" containerID="3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.752332 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d\": container with ID starting with 
3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d not found: ID does not exist" containerID="3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.752368 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d"} err="failed to get container status \"3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d\": rpc error: code = NotFound desc = could not find container \"3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d\": container with ID starting with 3fd1a114b4728c990f5da759f1891daa306cedc0fcf045e776b6090f5855114d not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.752390 5033 scope.go:117] "RemoveContainer" containerID="64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.753190 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271\": container with ID starting with 64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271 not found: ID does not exist" containerID="64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.753236 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271"} err="failed to get container status \"64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271\": rpc error: code = NotFound desc = could not find container \"64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271\": container with ID starting with 64bf3494c34c4e3bfd4af59b2b1c9c4ffbe4b5a935e3f044faec0a0bc521a271 not found: ID does not 
exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.753250 5033 scope.go:117] "RemoveContainer" containerID="abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.773339 5033 scope.go:117] "RemoveContainer" containerID="f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.774361 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8ee9ca5-b342-4011-8b36-d2d00314d515" (UID: "c8ee9ca5-b342-4011-8b36-d2d00314d515"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.801246 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2sckc"] Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.801559 5033 scope.go:117] "RemoveContainer" containerID="063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.802901 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities\") pod \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.802939 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content\") pod \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.802975 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca\") pod \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics\") pod \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803072 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89nkq\" (UniqueName: \"kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq\") pod \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\" (UID: \"70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803118 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nrxm\" (UniqueName: \"kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm\") pod \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\" (UID: \"ce5d4445-04fd-4c00-8b1e-393386cc78ad\") " Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803479 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f208ab47-0aa9-4386-bd40-098dc2930be6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803497 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803511 5033 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-9ps22\" (UniqueName: \"kubernetes.io/projected/c8ee9ca5-b342-4011-8b36-d2d00314d515-kube-api-access-9ps22\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803526 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803538 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803551 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqx4m\" (UniqueName: \"kubernetes.io/projected/63e3da86-ceaf-47ef-81af-07853efd035b-kube-api-access-wqx4m\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803563 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5thpm\" (UniqueName: \"kubernetes.io/projected/f208ab47-0aa9-4386-bd40-098dc2930be6-kube-api-access-5thpm\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803576 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f208ab47-0aa9-4386-bd40-098dc2930be6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.803589 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ee9ca5-b342-4011-8b36-d2d00314d515-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.804942 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities" (OuterVolumeSpecName: 
"utilities") pod "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" (UID: "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.806187 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ce5d4445-04fd-4c00-8b1e-393386cc78ad" (UID: "ce5d4445-04fd-4c00-8b1e-393386cc78ad"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.806254 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ce5d4445-04fd-4c00-8b1e-393386cc78ad" (UID: "ce5d4445-04fd-4c00-8b1e-393386cc78ad"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.807060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq" (OuterVolumeSpecName: "kube-api-access-89nkq") pod "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" (UID: "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6"). InnerVolumeSpecName "kube-api-access-89nkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: W0319 19:02:16.809466 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a8ebf9_7711_48a2_a0bd_eb497fd74770.slice/crio-a79271bd7c4c3fb4649daff9edaaeefb7f75d10a361a6e5a974da2464b4e73a2 WatchSource:0}: Error finding container a79271bd7c4c3fb4649daff9edaaeefb7f75d10a361a6e5a974da2464b4e73a2: Status 404 returned error can't find the container with id a79271bd7c4c3fb4649daff9edaaeefb7f75d10a361a6e5a974da2464b4e73a2 Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.810529 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm" (OuterVolumeSpecName: "kube-api-access-9nrxm") pod "ce5d4445-04fd-4c00-8b1e-393386cc78ad" (UID: "ce5d4445-04fd-4c00-8b1e-393386cc78ad"). InnerVolumeSpecName "kube-api-access-9nrxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.835546 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" (UID: "70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.881103 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xpkw6"] Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.899631 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63e3da86-ceaf-47ef-81af-07853efd035b" (UID: "63e3da86-ceaf-47ef-81af-07853efd035b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905014 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nrxm\" (UniqueName: \"kubernetes.io/projected/ce5d4445-04fd-4c00-8b1e-393386cc78ad-kube-api-access-9nrxm\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905045 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e3da86-ceaf-47ef-81af-07853efd035b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905059 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905073 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905085 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905095 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce5d4445-04fd-4c00-8b1e-393386cc78ad-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.905107 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89nkq\" (UniqueName: \"kubernetes.io/projected/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6-kube-api-access-89nkq\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.911520 5033 scope.go:117] "RemoveContainer" containerID="abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.911875 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5\": container with ID starting with abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5 not found: ID does not exist" containerID="abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.911916 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5"} err="failed to get container status \"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5\": rpc error: code = NotFound desc = could not find container \"abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5\": container with ID starting with abf71151ebed4506e667823043febeef63a37b015c8efd1971b94f73d056d2a5 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.911943 
5033 scope.go:117] "RemoveContainer" containerID="f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.912238 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01\": container with ID starting with f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01 not found: ID does not exist" containerID="f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.912270 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01"} err="failed to get container status \"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01\": rpc error: code = NotFound desc = could not find container \"f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01\": container with ID starting with f60055efc4bd1a57d54206db1a121975f57fe4e3e71d3829962b621bb118ec01 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.912293 5033 scope.go:117] "RemoveContainer" containerID="063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.912614 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2\": container with ID starting with 063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2 not found: ID does not exist" containerID="063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.912657 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2"} err="failed to get container status \"063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2\": rpc error: code = NotFound desc = could not find container \"063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2\": container with ID starting with 063c5aa3f6204aef5044a90c236c10e39dc9d2d0fbc81f5d0065fde7f84d8ff2 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.912692 5033 scope.go:117] "RemoveContainer" containerID="6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.940230 5033 scope.go:117] "RemoveContainer" containerID="7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.969056 5033 scope.go:117] "RemoveContainer" containerID="3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.983518 5033 scope.go:117] "RemoveContainer" containerID="6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.983905 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1\": container with ID starting with 6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1 not found: ID does not exist" containerID="6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.983942 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1"} err="failed to get container status \"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1\": rpc error: code = 
NotFound desc = could not find container \"6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1\": container with ID starting with 6243a025db545b2a7906bedbe78b3808909ff0df76c46a1c4721772715a163e1 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.984442 5033 scope.go:117] "RemoveContainer" containerID="7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.985017 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244\": container with ID starting with 7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244 not found: ID does not exist" containerID="7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.985052 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244"} err="failed to get container status \"7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244\": rpc error: code = NotFound desc = could not find container \"7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244\": container with ID starting with 7017dcd72bb0bf07628ed32ef5084fca21f63e735956f3e2ccb87c10e88cd244 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.985077 5033 scope.go:117] "RemoveContainer" containerID="3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418" Mar 19 19:02:16 crc kubenswrapper[5033]: E0319 19:02:16.985468 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418\": container with ID starting with 
3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418 not found: ID does not exist" containerID="3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.985496 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418"} err="failed to get container status \"3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418\": rpc error: code = NotFound desc = could not find container \"3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418\": container with ID starting with 3e32ae78ca83a9ede0357883e7dfd1b3376b245d12e6b7ae1907609ea0ed6418 not found: ID does not exist" Mar 19 19:02:16 crc kubenswrapper[5033]: I0319 19:02:16.985513 5033 scope.go:117] "RemoveContainer" containerID="c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.004841 5033 scope.go:117] "RemoveContainer" containerID="c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.027431 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.033955 5033 scope.go:117] "RemoveContainer" containerID="c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.035931 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca\": container with ID starting with c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca not found: ID does not exist" containerID="c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 
19:02:17.036091 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca"} err="failed to get container status \"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca\": rpc error: code = NotFound desc = could not find container \"c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca\": container with ID starting with c519e87966f8be2dfc95c67e1793e95900614bc6fd4132d1c252514de7d7bfca not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.038110 5033 scope.go:117] "RemoveContainer" containerID="c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.038257 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dvj7"] Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.038594 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25\": container with ID starting with c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25 not found: ID does not exist" containerID="c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.038616 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25"} err="failed to get container status \"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25\": rpc error: code = NotFound desc = could not find container \"c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25\": container with ID starting with c8f54382b92966332e3add6e1fa3f328825699e5e8689ac442947a1d3b526f25 not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: 
I0319 19:02:17.038631 5033 scope.go:117] "RemoveContainer" containerID="1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.045520 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2zhl"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.058109 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2zhl"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.065216 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.070248 5033 scope.go:117] "RemoveContainer" containerID="1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.070733 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944\": container with ID starting with 1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944 not found: ID does not exist" containerID="1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.070780 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944"} err="failed to get container status \"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944\": rpc error: code = NotFound desc = could not find container \"1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944\": container with ID starting with 1751129c904dfe2cb3f99b08a486a3401793fae3be5aa0e998d3125dcbdf8944 not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.070836 5033 scope.go:117] "RemoveContainer" 
containerID="f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.072545 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ddsbp"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.075427 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.079769 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqvsx"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.081659 5033 scope.go:117] "RemoveContainer" containerID="e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.084824 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.087246 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb5d498c-7kgzx"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.093624 5033 scope.go:117] "RemoveContainer" containerID="37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.108816 5033 scope.go:117] "RemoveContainer" containerID="f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.109165 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2\": container with ID starting with f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2 not found: ID does not exist" 
containerID="f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.109191 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2"} err="failed to get container status \"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2\": rpc error: code = NotFound desc = could not find container \"f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2\": container with ID starting with f1912bf048f1d68f00002e3541a368a2ec0665957a8b86c450c3b5a55c0a2fd2 not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.109213 5033 scope.go:117] "RemoveContainer" containerID="e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.109536 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075\": container with ID starting with e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075 not found: ID does not exist" containerID="e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.109583 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075"} err="failed to get container status \"e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075\": rpc error: code = NotFound desc = could not find container \"e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075\": container with ID starting with e7d30dc6917c02f0f541d56c8328d77a63045a5a131e1588f41ee15a79476075 not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.109618 5033 scope.go:117] 
"RemoveContainer" containerID="37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.109940 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d\": container with ID starting with 37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d not found: ID does not exist" containerID="37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.109964 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d"} err="failed to get container status \"37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d\": rpc error: code = NotFound desc = could not find container \"37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d\": container with ID starting with 37b1dd98cc9cf648425238cbb4209ba9f20d6b02da949473b249afbf48e6518d not found: ID does not exist" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589212 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg"] Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589418 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589431 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589441 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: 
I0319 19:02:17.589466 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589477 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589484 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589493 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f208ab47-0aa9-4386-bd40-098dc2930be6" containerName="route-controller-manager" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589500 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f208ab47-0aa9-4386-bd40-098dc2930be6" containerName="route-controller-manager" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589518 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589524 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589533 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589548 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="registry-server" Mar 19 19:02:17 crc 
kubenswrapper[5033]: I0319 19:02:17.589553 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589563 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589568 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589575 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589580 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589589 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589595 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589602 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589608 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589615 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="extract-content" Mar 19 
19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589621 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589628 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589634 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="extract-content" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589641 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589648 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: E0319 19:02:17.589655 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589661 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="extract-utilities" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589739 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589750 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589760 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" containerName="registry-server" Mar 
19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589768 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" containerName="marketplace-operator" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589775 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589782 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" containerName="registry-server" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.589788 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f208ab47-0aa9-4386-bd40-098dc2930be6" containerName="route-controller-manager" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.590108 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.592375 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.592700 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.592808 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.592749 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.592812 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.593031 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.601210 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg"] Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.726514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" event={"ID":"42a8ebf9-7711-48a2-a0bd-eb497fd74770","Type":"ContainerStarted","Data":"ee4b68d77ac659be5a14f62112ec39b8f13361b0f628ce5c3405a003acc8389b"} Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.726898 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.726934 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" event={"ID":"42a8ebf9-7711-48a2-a0bd-eb497fd74770","Type":"ContainerStarted","Data":"a79271bd7c4c3fb4649daff9edaaeefb7f75d10a361a6e5a974da2464b4e73a2"} Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.732393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" event={"ID":"3a73e1d1-a896-4aa7-bba9-c372ab716534","Type":"ContainerStarted","Data":"8afa29e4e8135cf3414c2d1c56519c7a86ddd461d14622fea5cd4f70c9707ed4"} Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.732429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" event={"ID":"3a73e1d1-a896-4aa7-bba9-c372ab716534","Type":"ContainerStarted","Data":"fb5218f5f02b0090157be7977f2208d3fd86e8e9a4cabcbe8d3e5b79a627a40a"} Mar 19 
19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.733260 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.737637 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.743746 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" podStartSLOduration=2.7437264839999997 podStartE2EDuration="2.743726484s" podCreationTimestamp="2026-03-19 19:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:02:17.743437898 +0000 UTC m=+347.848467747" watchObservedRunningTime="2026-03-19 19:02:17.743726484 +0000 UTC m=+347.848756343" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.764644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-config\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.764715 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-client-ca\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.764737 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7l4\" (UniqueName: \"kubernetes.io/projected/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-kube-api-access-lz7l4\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.764888 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-serving-cert\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.771885 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xpkw6" podStartSLOduration=2.771862711 podStartE2EDuration="2.771862711s" podCreationTimestamp="2026-03-19 19:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:02:17.767312521 +0000 UTC m=+347.872342400" watchObservedRunningTime="2026-03-19 19:02:17.771862711 +0000 UTC m=+347.876892560" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.865999 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-config\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.866065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-client-ca\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.866098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7l4\" (UniqueName: \"kubernetes.io/projected/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-kube-api-access-lz7l4\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.866152 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-serving-cert\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.868766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-client-ca\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.868777 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-config\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.872382 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-serving-cert\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.883793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7l4\" (UniqueName: \"kubernetes.io/projected/f7b10a5b-ec78-418e-a8cc-0378a0a89d00-kube-api-access-lz7l4\") pod \"route-controller-manager-7cb6456979-trgjg\" (UID: \"f7b10a5b-ec78-418e-a8cc-0378a0a89d00\") " pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:17 crc kubenswrapper[5033]: I0319 19:02:17.977290 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.028294 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdzw6"] Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.035050 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.039287 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.048392 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdzw6"] Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.170908 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-utilities\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.171173 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqr8c\" (UniqueName: \"kubernetes.io/projected/64d10e25-ab07-42ba-90c0-4b57737633f7-kube-api-access-wqr8c\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.171214 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-catalog-content\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.272312 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-catalog-content\") pod \"redhat-marketplace-mdzw6\" (UID: 
\"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.272405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-utilities\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.272424 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqr8c\" (UniqueName: \"kubernetes.io/projected/64d10e25-ab07-42ba-90c0-4b57737633f7-kube-api-access-wqr8c\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.273381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-catalog-content\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.273611 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d10e25-ab07-42ba-90c0-4b57737633f7-utilities\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.293605 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqr8c\" (UniqueName: \"kubernetes.io/projected/64d10e25-ab07-42ba-90c0-4b57737633f7-kube-api-access-wqr8c\") pod \"redhat-marketplace-mdzw6\" (UID: \"64d10e25-ab07-42ba-90c0-4b57737633f7\") " 
pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.362540 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.379647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg"] Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.631951 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e3da86-ceaf-47ef-81af-07853efd035b" path="/var/lib/kubelet/pods/63e3da86-ceaf-47ef-81af-07853efd035b/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.633821 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6" path="/var/lib/kubelet/pods/70dbfc25-0b94-4c8f-89bb-af0d0bb8e2e6/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.634494 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab53fd0b-8294-4dd0-a434-ab7eaff0e360" path="/var/lib/kubelet/pods/ab53fd0b-8294-4dd0-a434-ab7eaff0e360/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.635524 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ee9ca5-b342-4011-8b36-d2d00314d515" path="/var/lib/kubelet/pods/c8ee9ca5-b342-4011-8b36-d2d00314d515/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.636083 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5d4445-04fd-4c00-8b1e-393386cc78ad" path="/var/lib/kubelet/pods/ce5d4445-04fd-4c00-8b1e-393386cc78ad/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.638406 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f208ab47-0aa9-4386-bd40-098dc2930be6" path="/var/lib/kubelet/pods/f208ab47-0aa9-4386-bd40-098dc2930be6/volumes" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 
19:02:18.639085 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.640060 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.640310 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.641816 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.737595 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" event={"ID":"f7b10a5b-ec78-418e-a8cc-0378a0a89d00","Type":"ContainerStarted","Data":"2aaf46223f9a1dc122ac57a2a35cc95eb4b6bdd0edf062cf8db7625cb3ce6927"} Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.737640 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" event={"ID":"f7b10a5b-ec78-418e-a8cc-0378a0a89d00","Type":"ContainerStarted","Data":"73b22f763bc8c4fb9f975da988c13d7ed7b629c3641b9cc5ababd9125086dbc1"} Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.738469 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.782068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ktj\" (UniqueName: \"kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " 
pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.782155 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.782182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.826408 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" podStartSLOduration=3.826387713 podStartE2EDuration="3.826387713s" podCreationTimestamp="2026-03-19 19:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:02:18.755838768 +0000 UTC m=+348.860868617" watchObservedRunningTime="2026-03-19 19:02:18.826387713 +0000 UTC m=+348.931417572" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.831252 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdzw6"] Mar 19 19:02:18 crc kubenswrapper[5033]: W0319 19:02:18.839885 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d10e25_ab07_42ba_90c0_4b57737633f7.slice/crio-28a62a08256827778f8f8117fe5a938099d3baa42ee1b8f83c8f0841c3e09868 WatchSource:0}: Error finding 
container 28a62a08256827778f8f8117fe5a938099d3baa42ee1b8f83c8f0841c3e09868: Status 404 returned error can't find the container with id 28a62a08256827778f8f8117fe5a938099d3baa42ee1b8f83c8f0841c3e09868 Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.882960 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ktj\" (UniqueName: \"kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.883088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.883144 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.883575 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.884206 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.911199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ktj\" (UniqueName: \"kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj\") pod \"certified-operators-5jd5t\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:18 crc kubenswrapper[5033]: I0319 19:02:18.957503 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.019789 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb6456979-trgjg" Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.369287 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:02:19 crc kubenswrapper[5033]: W0319 19:02:19.377538 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae96f2f_8565_4e27_a05b_c273e4859d74.slice/crio-c87cd7f40a3522acdaea750225bcad1570a4008c56fbf27051da0857e253bba8 WatchSource:0}: Error finding container c87cd7f40a3522acdaea750225bcad1570a4008c56fbf27051da0857e253bba8: Status 404 returned error can't find the container with id c87cd7f40a3522acdaea750225bcad1570a4008c56fbf27051da0857e253bba8 Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.745102 5033 generic.go:334] "Generic (PLEG): container finished" podID="64d10e25-ab07-42ba-90c0-4b57737633f7" containerID="406c747df882f91a39140ee31e4076acd8f2cf133c80c251cdde5b5bfa8a9a58" 
exitCode=0 Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.745155 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdzw6" event={"ID":"64d10e25-ab07-42ba-90c0-4b57737633f7","Type":"ContainerDied","Data":"406c747df882f91a39140ee31e4076acd8f2cf133c80c251cdde5b5bfa8a9a58"} Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.745207 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdzw6" event={"ID":"64d10e25-ab07-42ba-90c0-4b57737633f7","Type":"ContainerStarted","Data":"28a62a08256827778f8f8117fe5a938099d3baa42ee1b8f83c8f0841c3e09868"} Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.746716 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerID="f2e71ca4a03e3b27ba111ccc31b123ad929f18a288ee1626b8ac193467f3f95d" exitCode=0 Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.746744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerDied","Data":"f2e71ca4a03e3b27ba111ccc31b123ad929f18a288ee1626b8ac193467f3f95d"} Mar 19 19:02:19 crc kubenswrapper[5033]: I0319 19:02:19.746772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerStarted","Data":"c87cd7f40a3522acdaea750225bcad1570a4008c56fbf27051da0857e253bba8"} Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.439542 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b57dx"] Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.443359 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.444402 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b57dx"] Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.445895 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.604391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-catalog-content\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.604534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4w4q\" (UniqueName: \"kubernetes.io/projected/1686c372-2322-47b2-a8a9-ad674bd5bf0b-kube-api-access-s4w4q\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.604742 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-utilities\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.706314 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-catalog-content\") pod \"redhat-operators-b57dx\" (UID: 
\"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.706541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4w4q\" (UniqueName: \"kubernetes.io/projected/1686c372-2322-47b2-a8a9-ad674bd5bf0b-kube-api-access-s4w4q\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.706618 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-utilities\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.707274 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-catalog-content\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.707290 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1686c372-2322-47b2-a8a9-ad674bd5bf0b-utilities\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.727880 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4w4q\" (UniqueName: \"kubernetes.io/projected/1686c372-2322-47b2-a8a9-ad674bd5bf0b-kube-api-access-s4w4q\") pod \"redhat-operators-b57dx\" (UID: \"1686c372-2322-47b2-a8a9-ad674bd5bf0b\") " 
pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.756223 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdzw6" event={"ID":"64d10e25-ab07-42ba-90c0-4b57737633f7","Type":"ContainerStarted","Data":"b8b5cdd60d6b6d3b76f9d1295e5a11ac4261b6a75d896bf4fd8aeaed95efaba2"} Mar 19 19:02:20 crc kubenswrapper[5033]: I0319 19:02:20.776341 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.027682 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.029358 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.034592 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.037420 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.110061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcncm\" (UniqueName: \"kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.110110 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities\") pod 
\"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.110128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.211938 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcncm\" (UniqueName: \"kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.212046 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.212084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.212722 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities\") pod 
\"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.213412 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.235990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcncm\" (UniqueName: \"kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm\") pod \"community-operators-mrg7t\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.242724 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b57dx"] Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.383021 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.763410 5033 generic.go:334] "Generic (PLEG): container finished" podID="64d10e25-ab07-42ba-90c0-4b57737633f7" containerID="b8b5cdd60d6b6d3b76f9d1295e5a11ac4261b6a75d896bf4fd8aeaed95efaba2" exitCode=0 Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.763504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdzw6" event={"ID":"64d10e25-ab07-42ba-90c0-4b57737633f7","Type":"ContainerDied","Data":"b8b5cdd60d6b6d3b76f9d1295e5a11ac4261b6a75d896bf4fd8aeaed95efaba2"} Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.767420 5033 generic.go:334] "Generic (PLEG): container finished" podID="1686c372-2322-47b2-a8a9-ad674bd5bf0b" containerID="17522f6216b29c04766ad6f1bcb6cff70df14e86e53eec8fce03a97eb44726c5" exitCode=0 Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.767487 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b57dx" event={"ID":"1686c372-2322-47b2-a8a9-ad674bd5bf0b","Type":"ContainerDied","Data":"17522f6216b29c04766ad6f1bcb6cff70df14e86e53eec8fce03a97eb44726c5"} Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.767515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b57dx" event={"ID":"1686c372-2322-47b2-a8a9-ad674bd5bf0b","Type":"ContainerStarted","Data":"f00d059e04c0fceeb57dd871ea77aefdd6fa6b35f0896344a02e2a945ea3bc56"} Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.769684 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerID="6b9fe82c963ba5333b073be36284bc19244d2958a0f421a2001876d924418039" exitCode=0 Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.769737 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" 
event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerDied","Data":"6b9fe82c963ba5333b073be36284bc19244d2958a0f421a2001876d924418039"} Mar 19 19:02:21 crc kubenswrapper[5033]: I0319 19:02:21.783173 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 19:02:21 crc kubenswrapper[5033]: W0319 19:02:21.792164 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ec342c_bcf6_4ec4_838b_f0af17eb5aac.slice/crio-59543798e46a3b207e760d13bc5d5eaf2989e1700b49d394e04084f0adbdff0f WatchSource:0}: Error finding container 59543798e46a3b207e760d13bc5d5eaf2989e1700b49d394e04084f0adbdff0f: Status 404 returned error can't find the container with id 59543798e46a3b207e760d13bc5d5eaf2989e1700b49d394e04084f0adbdff0f Mar 19 19:02:22 crc kubenswrapper[5033]: I0319 19:02:22.789978 5033 generic.go:334] "Generic (PLEG): container finished" podID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerID="46894d7a23efff17aeb7836e73004de72438e6bf40b599ddc766813e7324691a" exitCode=0 Mar 19 19:02:22 crc kubenswrapper[5033]: I0319 19:02:22.790078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerDied","Data":"46894d7a23efff17aeb7836e73004de72438e6bf40b599ddc766813e7324691a"} Mar 19 19:02:22 crc kubenswrapper[5033]: I0319 19:02:22.790126 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerStarted","Data":"59543798e46a3b207e760d13bc5d5eaf2989e1700b49d394e04084f0adbdff0f"} Mar 19 19:02:23 crc kubenswrapper[5033]: I0319 19:02:23.796335 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdzw6" 
event={"ID":"64d10e25-ab07-42ba-90c0-4b57737633f7","Type":"ContainerStarted","Data":"6dedefdec91b62752d125018965fa06e0accefb26e79b5d67d5d67ee033c21c4"} Mar 19 19:02:23 crc kubenswrapper[5033]: I0319 19:02:23.798082 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b57dx" event={"ID":"1686c372-2322-47b2-a8a9-ad674bd5bf0b","Type":"ContainerStarted","Data":"913aabdaf9751165d84972b2f42ac8b7fd8cde1ebafcc39216501389ea8f4c6b"} Mar 19 19:02:23 crc kubenswrapper[5033]: I0319 19:02:23.800330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerStarted","Data":"bcfa8c2282c288d7eac3c600fb5994c8cff8bc1719c18bed883e208ef8e7f5a0"} Mar 19 19:02:23 crc kubenswrapper[5033]: I0319 19:02:23.816736 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdzw6" podStartSLOduration=2.486388777 podStartE2EDuration="5.81672197s" podCreationTimestamp="2026-03-19 19:02:18 +0000 UTC" firstStartedPulling="2026-03-19 19:02:19.749144018 +0000 UTC m=+349.854173867" lastFinishedPulling="2026-03-19 19:02:23.079477211 +0000 UTC m=+353.184507060" observedRunningTime="2026-03-19 19:02:23.813597658 +0000 UTC m=+353.918627517" watchObservedRunningTime="2026-03-19 19:02:23.81672197 +0000 UTC m=+353.921751819" Mar 19 19:02:23 crc kubenswrapper[5033]: I0319 19:02:23.833368 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jd5t" podStartSLOduration=2.153454543 podStartE2EDuration="5.833351768s" podCreationTimestamp="2026-03-19 19:02:18 +0000 UTC" firstStartedPulling="2026-03-19 19:02:19.747923304 +0000 UTC m=+349.852953153" lastFinishedPulling="2026-03-19 19:02:23.427820529 +0000 UTC m=+353.532850378" observedRunningTime="2026-03-19 19:02:23.830342479 +0000 UTC m=+353.935372338" 
watchObservedRunningTime="2026-03-19 19:02:23.833351768 +0000 UTC m=+353.938381617" Mar 19 19:02:24 crc kubenswrapper[5033]: E0319 19:02:24.144231 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ec342c_bcf6_4ec4_838b_f0af17eb5aac.slice/crio-3802fac55be63f703d118c253d75d76a1b8c1e910b72baea680f5548baaa212f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ec342c_bcf6_4ec4_838b_f0af17eb5aac.slice/crio-conmon-3802fac55be63f703d118c253d75d76a1b8c1e910b72baea680f5548baaa212f.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:02:24 crc kubenswrapper[5033]: I0319 19:02:24.806591 5033 generic.go:334] "Generic (PLEG): container finished" podID="1686c372-2322-47b2-a8a9-ad674bd5bf0b" containerID="913aabdaf9751165d84972b2f42ac8b7fd8cde1ebafcc39216501389ea8f4c6b" exitCode=0 Mar 19 19:02:24 crc kubenswrapper[5033]: I0319 19:02:24.806920 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b57dx" event={"ID":"1686c372-2322-47b2-a8a9-ad674bd5bf0b","Type":"ContainerDied","Data":"913aabdaf9751165d84972b2f42ac8b7fd8cde1ebafcc39216501389ea8f4c6b"} Mar 19 19:02:24 crc kubenswrapper[5033]: I0319 19:02:24.813093 5033 generic.go:334] "Generic (PLEG): container finished" podID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerID="3802fac55be63f703d118c253d75d76a1b8c1e910b72baea680f5548baaa212f" exitCode=0 Mar 19 19:02:24 crc kubenswrapper[5033]: I0319 19:02:24.813211 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerDied","Data":"3802fac55be63f703d118c253d75d76a1b8c1e910b72baea680f5548baaa212f"} Mar 19 19:02:25 crc kubenswrapper[5033]: I0319 19:02:25.821113 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-b57dx" event={"ID":"1686c372-2322-47b2-a8a9-ad674bd5bf0b","Type":"ContainerStarted","Data":"e4e9d0d91dcd655f503747cc77ef7fd1997b0837480491776e6962678ac5e2e5"} Mar 19 19:02:25 crc kubenswrapper[5033]: I0319 19:02:25.823789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerStarted","Data":"9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1"} Mar 19 19:02:25 crc kubenswrapper[5033]: I0319 19:02:25.864617 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b57dx" podStartSLOduration=2.418034912 podStartE2EDuration="5.864600344s" podCreationTimestamp="2026-03-19 19:02:20 +0000 UTC" firstStartedPulling="2026-03-19 19:02:21.768719263 +0000 UTC m=+351.873749112" lastFinishedPulling="2026-03-19 19:02:25.215284695 +0000 UTC m=+355.320314544" observedRunningTime="2026-03-19 19:02:25.84363367 +0000 UTC m=+355.948663519" watchObservedRunningTime="2026-03-19 19:02:25.864600344 +0000 UTC m=+355.969630193" Mar 19 19:02:25 crc kubenswrapper[5033]: I0319 19:02:25.867011 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mrg7t" podStartSLOduration=2.352063222 podStartE2EDuration="4.867005852s" podCreationTimestamp="2026-03-19 19:02:21 +0000 UTC" firstStartedPulling="2026-03-19 19:02:22.794078778 +0000 UTC m=+352.899108657" lastFinishedPulling="2026-03-19 19:02:25.309021438 +0000 UTC m=+355.414051287" observedRunningTime="2026-03-19 19:02:25.863974392 +0000 UTC m=+355.969004241" watchObservedRunningTime="2026-03-19 19:02:25.867005852 +0000 UTC m=+355.972035691" Mar 19 19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.363317 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 
19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.363668 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.402396 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.884031 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdzw6" Mar 19 19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.957844 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:28 crc kubenswrapper[5033]: I0319 19:02:28.957905 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:29 crc kubenswrapper[5033]: I0319 19:02:29.016082 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:29 crc kubenswrapper[5033]: I0319 19:02:29.900707 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:02:30 crc kubenswrapper[5033]: I0319 19:02:30.776684 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:30 crc kubenswrapper[5033]: I0319 19:02:30.776727 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:31 crc kubenswrapper[5033]: I0319 19:02:31.384263 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:31 crc kubenswrapper[5033]: I0319 19:02:31.384330 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:31 crc kubenswrapper[5033]: I0319 19:02:31.429036 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:31 crc kubenswrapper[5033]: I0319 19:02:31.814131 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b57dx" podUID="1686c372-2322-47b2-a8a9-ad674bd5bf0b" containerName="registry-server" probeResult="failure" output=< Mar 19 19:02:31 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:02:31 crc kubenswrapper[5033]: > Mar 19 19:02:31 crc kubenswrapper[5033]: I0319 19:02:31.897629 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 19:02:36 crc kubenswrapper[5033]: I0319 19:02:36.001254 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2sckc" Mar 19 19:02:36 crc kubenswrapper[5033]: I0319 19:02:36.051095 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 19:02:40 crc kubenswrapper[5033]: I0319 19:02:40.810633 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:02:40 crc kubenswrapper[5033]: I0319 19:02:40.880891 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b57dx" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.093769 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" podUID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" containerName="registry" containerID="cri-o://85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60" 
gracePeriod=30 Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.478878 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543285 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543340 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543368 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543654 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543714 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: 
\"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543752 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543779 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2hjb\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.543830 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca\") pod \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\" (UID: \"1de9caaa-c912-48f2-9306-5cc7768fc8b3\") " Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.544780 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.544826 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.556154 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb" (OuterVolumeSpecName: "kube-api-access-x2hjb") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "kube-api-access-x2hjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.557787 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.557968 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.558399 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.563668 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.567617 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1de9caaa-c912-48f2-9306-5cc7768fc8b3" (UID: "1de9caaa-c912-48f2-9306-5cc7768fc8b3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645736 5033 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1de9caaa-c912-48f2-9306-5cc7768fc8b3-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645763 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2hjb\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-kube-api-access-x2hjb\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645776 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645788 5033 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645799 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1de9caaa-c912-48f2-9306-5cc7768fc8b3-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645812 5033 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1de9caaa-c912-48f2-9306-5cc7768fc8b3-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:01 crc kubenswrapper[5033]: I0319 19:03:01.645823 5033 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1de9caaa-c912-48f2-9306-5cc7768fc8b3-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.019847 5033 generic.go:334] "Generic (PLEG): container finished" podID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" containerID="85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60" exitCode=0 Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.020144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" event={"ID":"1de9caaa-c912-48f2-9306-5cc7768fc8b3","Type":"ContainerDied","Data":"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60"} Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.020169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" event={"ID":"1de9caaa-c912-48f2-9306-5cc7768fc8b3","Type":"ContainerDied","Data":"edb4157b1d5b61ab0994b9a1a7b49b4942b9ec1d4f3ae87fe0a5a764ff12bf61"} Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.020184 5033 scope.go:117] "RemoveContainer" 
containerID="85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60" Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.020289 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sl57l" Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.035670 5033 scope.go:117] "RemoveContainer" containerID="85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60" Mar 19 19:03:02 crc kubenswrapper[5033]: E0319 19:03:02.036029 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60\": container with ID starting with 85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60 not found: ID does not exist" containerID="85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60" Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.036059 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60"} err="failed to get container status \"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60\": rpc error: code = NotFound desc = could not find container \"85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60\": container with ID starting with 85e1513c4f19b647981af4ac3e822fb677e216dc8af43df744887636a0bc9d60 not found: ID does not exist" Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.045235 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.048713 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sl57l"] Mar 19 19:03:02 crc kubenswrapper[5033]: I0319 19:03:02.633216 5033 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" path="/var/lib/kubelet/pods/1de9caaa-c912-48f2-9306-5cc7768fc8b3/volumes" Mar 19 19:03:40 crc kubenswrapper[5033]: I0319 19:03:40.758949 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:03:40 crc kubenswrapper[5033]: I0319 19:03:40.759385 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.123738 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565784-n2knf"] Mar 19 19:04:00 crc kubenswrapper[5033]: E0319 19:04:00.124396 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" containerName="registry" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.124409 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" containerName="registry" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.124517 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de9caaa-c912-48f2-9306-5cc7768fc8b3" containerName="registry" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.124858 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.128149 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.129358 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.129355 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.148769 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-n2knf"] Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.151367 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt4rf\" (UniqueName: \"kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf\") pod \"auto-csr-approver-29565784-n2knf\" (UID: \"989558ba-c11c-4f21-8354-aa0d39e841f1\") " pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.252967 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4rf\" (UniqueName: \"kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf\") pod \"auto-csr-approver-29565784-n2knf\" (UID: \"989558ba-c11c-4f21-8354-aa0d39e841f1\") " pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.273388 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4rf\" (UniqueName: \"kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf\") pod \"auto-csr-approver-29565784-n2knf\" (UID: \"989558ba-c11c-4f21-8354-aa0d39e841f1\") " 
pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.453152 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.871429 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-n2knf"] Mar 19 19:04:00 crc kubenswrapper[5033]: I0319 19:04:00.880721 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:04:01 crc kubenswrapper[5033]: I0319 19:04:01.400762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-n2knf" event={"ID":"989558ba-c11c-4f21-8354-aa0d39e841f1","Type":"ContainerStarted","Data":"02cff562a41af7c08770f4505189e82e1842cc67729c22f79a561807b2f902ac"} Mar 19 19:04:02 crc kubenswrapper[5033]: I0319 19:04:02.406882 5033 generic.go:334] "Generic (PLEG): container finished" podID="989558ba-c11c-4f21-8354-aa0d39e841f1" containerID="1465a9be8a5f6497878a78076b336a2b606a5a9d7ff442938e8f60aee0bd92a4" exitCode=0 Mar 19 19:04:02 crc kubenswrapper[5033]: I0319 19:04:02.406973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-n2knf" event={"ID":"989558ba-c11c-4f21-8354-aa0d39e841f1","Type":"ContainerDied","Data":"1465a9be8a5f6497878a78076b336a2b606a5a9d7ff442938e8f60aee0bd92a4"} Mar 19 19:04:03 crc kubenswrapper[5033]: I0319 19:04:03.753672 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:03 crc kubenswrapper[5033]: I0319 19:04:03.910784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt4rf\" (UniqueName: \"kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf\") pod \"989558ba-c11c-4f21-8354-aa0d39e841f1\" (UID: \"989558ba-c11c-4f21-8354-aa0d39e841f1\") " Mar 19 19:04:03 crc kubenswrapper[5033]: I0319 19:04:03.916917 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf" (OuterVolumeSpecName: "kube-api-access-xt4rf") pod "989558ba-c11c-4f21-8354-aa0d39e841f1" (UID: "989558ba-c11c-4f21-8354-aa0d39e841f1"). InnerVolumeSpecName "kube-api-access-xt4rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.012190 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt4rf\" (UniqueName: \"kubernetes.io/projected/989558ba-c11c-4f21-8354-aa0d39e841f1-kube-api-access-xt4rf\") on node \"crc\" DevicePath \"\"" Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.425785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-n2knf" event={"ID":"989558ba-c11c-4f21-8354-aa0d39e841f1","Type":"ContainerDied","Data":"02cff562a41af7c08770f4505189e82e1842cc67729c22f79a561807b2f902ac"} Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.425821 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cff562a41af7c08770f4505189e82e1842cc67729c22f79a561807b2f902ac" Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.425842 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-n2knf" Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.817715 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-gls8c"] Mar 19 19:04:04 crc kubenswrapper[5033]: I0319 19:04:04.823977 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-gls8c"] Mar 19 19:04:06 crc kubenswrapper[5033]: I0319 19:04:06.631528 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cd9325-9740-4a70-98a7-3de9ebb30035" path="/var/lib/kubelet/pods/05cd9325-9740-4a70-98a7-3de9ebb30035/volumes" Mar 19 19:04:10 crc kubenswrapper[5033]: I0319 19:04:10.759289 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:04:10 crc kubenswrapper[5033]: I0319 19:04:10.759639 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:04:40 crc kubenswrapper[5033]: I0319 19:04:40.758985 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:04:40 crc kubenswrapper[5033]: I0319 19:04:40.759804 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:04:40 crc kubenswrapper[5033]: I0319 19:04:40.759876 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:04:40 crc kubenswrapper[5033]: I0319 19:04:40.760820 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:04:40 crc kubenswrapper[5033]: I0319 19:04:40.760926 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3" gracePeriod=600 Mar 19 19:04:41 crc kubenswrapper[5033]: I0319 19:04:41.667472 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3" exitCode=0 Mar 19 19:04:41 crc kubenswrapper[5033]: I0319 19:04:41.667563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3"} Mar 19 19:04:41 crc kubenswrapper[5033]: I0319 19:04:41.667945 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc"} Mar 19 19:04:41 crc kubenswrapper[5033]: I0319 19:04:41.667972 5033 scope.go:117] "RemoveContainer" containerID="a71e554eede115b1dbad5dcab081e28172f883b64464bfe0cc7c7792b9e6f208" Mar 19 19:05:31 crc kubenswrapper[5033]: I0319 19:05:31.001052 5033 scope.go:117] "RemoveContainer" containerID="7920ff1eb303c17aa123a03056d83eb05368af06b8d4fbc2b64b3a373ec7144a" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.143943 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565786-njps6"] Mar 19 19:06:00 crc kubenswrapper[5033]: E0319 19:06:00.145112 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989558ba-c11c-4f21-8354-aa0d39e841f1" containerName="oc" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.145141 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="989558ba-c11c-4f21-8354-aa0d39e841f1" containerName="oc" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.145407 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="989558ba-c11c-4f21-8354-aa0d39e841f1" containerName="oc" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.146162 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.150158 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.150478 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.150681 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.156314 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-njps6"] Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.244966 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94m8\" (UniqueName: \"kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8\") pod \"auto-csr-approver-29565786-njps6\" (UID: \"f5823e80-5bf6-4248-b034-01d39e46d318\") " pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.347444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94m8\" (UniqueName: \"kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8\") pod \"auto-csr-approver-29565786-njps6\" (UID: \"f5823e80-5bf6-4248-b034-01d39e46d318\") " pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.373477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94m8\" (UniqueName: \"kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8\") pod \"auto-csr-approver-29565786-njps6\" (UID: \"f5823e80-5bf6-4248-b034-01d39e46d318\") " 
pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.480771 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:00 crc kubenswrapper[5033]: I0319 19:06:00.682617 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-njps6"] Mar 19 19:06:01 crc kubenswrapper[5033]: I0319 19:06:01.161744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-njps6" event={"ID":"f5823e80-5bf6-4248-b034-01d39e46d318","Type":"ContainerStarted","Data":"6d1ff57e49467081c07984c7f09e8e80a6e1e26e4e2b8abc9372a5bfc7e89675"} Mar 19 19:06:02 crc kubenswrapper[5033]: I0319 19:06:02.171293 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-njps6" event={"ID":"f5823e80-5bf6-4248-b034-01d39e46d318","Type":"ContainerStarted","Data":"530d4216967b7b9f17736b717db343b3048ea0a8aadb79c76916c810a64c439f"} Mar 19 19:06:02 crc kubenswrapper[5033]: I0319 19:06:02.200152 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565786-njps6" podStartSLOduration=1.118930421 podStartE2EDuration="2.200127815s" podCreationTimestamp="2026-03-19 19:06:00 +0000 UTC" firstStartedPulling="2026-03-19 19:06:00.691627003 +0000 UTC m=+570.796656852" lastFinishedPulling="2026-03-19 19:06:01.772824397 +0000 UTC m=+571.877854246" observedRunningTime="2026-03-19 19:06:02.195523688 +0000 UTC m=+572.300553577" watchObservedRunningTime="2026-03-19 19:06:02.200127815 +0000 UTC m=+572.305157704" Mar 19 19:06:03 crc kubenswrapper[5033]: I0319 19:06:03.180682 5033 generic.go:334] "Generic (PLEG): container finished" podID="f5823e80-5bf6-4248-b034-01d39e46d318" containerID="530d4216967b7b9f17736b717db343b3048ea0a8aadb79c76916c810a64c439f" exitCode=0 Mar 19 19:06:03 crc 
kubenswrapper[5033]: I0319 19:06:03.180783 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-njps6" event={"ID":"f5823e80-5bf6-4248-b034-01d39e46d318","Type":"ContainerDied","Data":"530d4216967b7b9f17736b717db343b3048ea0a8aadb79c76916c810a64c439f"} Mar 19 19:06:04 crc kubenswrapper[5033]: I0319 19:06:04.470075 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:04 crc kubenswrapper[5033]: I0319 19:06:04.506024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f94m8\" (UniqueName: \"kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8\") pod \"f5823e80-5bf6-4248-b034-01d39e46d318\" (UID: \"f5823e80-5bf6-4248-b034-01d39e46d318\") " Mar 19 19:06:04 crc kubenswrapper[5033]: I0319 19:06:04.513867 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8" (OuterVolumeSpecName: "kube-api-access-f94m8") pod "f5823e80-5bf6-4248-b034-01d39e46d318" (UID: "f5823e80-5bf6-4248-b034-01d39e46d318"). InnerVolumeSpecName "kube-api-access-f94m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:06:04 crc kubenswrapper[5033]: I0319 19:06:04.607811 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f94m8\" (UniqueName: \"kubernetes.io/projected/f5823e80-5bf6-4248-b034-01d39e46d318-kube-api-access-f94m8\") on node \"crc\" DevicePath \"\"" Mar 19 19:06:05 crc kubenswrapper[5033]: I0319 19:06:05.198657 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-njps6" event={"ID":"f5823e80-5bf6-4248-b034-01d39e46d318","Type":"ContainerDied","Data":"6d1ff57e49467081c07984c7f09e8e80a6e1e26e4e2b8abc9372a5bfc7e89675"} Mar 19 19:06:05 crc kubenswrapper[5033]: I0319 19:06:05.198719 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1ff57e49467081c07984c7f09e8e80a6e1e26e4e2b8abc9372a5bfc7e89675" Mar 19 19:06:05 crc kubenswrapper[5033]: I0319 19:06:05.198803 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-njps6" Mar 19 19:06:05 crc kubenswrapper[5033]: I0319 19:06:05.264111 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-g2zh9"] Mar 19 19:06:05 crc kubenswrapper[5033]: I0319 19:06:05.271203 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-g2zh9"] Mar 19 19:06:06 crc kubenswrapper[5033]: I0319 19:06:06.629306 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d7d36a-f1c2-4969-8368-75376b0c2197" path="/var/lib/kubelet/pods/e0d7d36a-f1c2-4969-8368-75376b0c2197/volumes" Mar 19 19:07:10 crc kubenswrapper[5033]: I0319 19:07:10.759504 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 19:07:10 crc kubenswrapper[5033]: I0319 19:07:10.760396 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.327664 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8"] Mar 19 19:07:15 crc kubenswrapper[5033]: E0319 19:07:15.328233 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5823e80-5bf6-4248-b034-01d39e46d318" containerName="oc" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.328252 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5823e80-5bf6-4248-b034-01d39e46d318" containerName="oc" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.328521 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5823e80-5bf6-4248-b034-01d39e46d318" containerName="oc" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.329597 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.332736 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.339748 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8"] Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.382012 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqhf\" (UniqueName: \"kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.382093 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.382203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: 
I0319 19:07:15.483282 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqhf\" (UniqueName: \"kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.483375 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.483413 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.484154 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.484187 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.508997 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqhf\" (UniqueName: \"kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:15 crc kubenswrapper[5033]: I0319 19:07:15.650243 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:16 crc kubenswrapper[5033]: I0319 19:07:16.031736 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8"] Mar 19 19:07:16 crc kubenswrapper[5033]: I0319 19:07:16.645896 5033 generic.go:334] "Generic (PLEG): container finished" podID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerID="f4d3263ce5e23df084257b44ba0d82d71e4e3b70fadef2349662cc9f4d934fcb" exitCode=0 Mar 19 19:07:16 crc kubenswrapper[5033]: I0319 19:07:16.645968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" event={"ID":"f7526813-a7dc-4074-a7ea-f791760e3cb0","Type":"ContainerDied","Data":"f4d3263ce5e23df084257b44ba0d82d71e4e3b70fadef2349662cc9f4d934fcb"} Mar 19 19:07:16 crc kubenswrapper[5033]: I0319 19:07:16.646019 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" event={"ID":"f7526813-a7dc-4074-a7ea-f791760e3cb0","Type":"ContainerStarted","Data":"25a5abcb4471bda51f7aac756dabf9c84f7743db88b792d8559cb312af9ba19f"} Mar 19 19:07:24 crc kubenswrapper[5033]: I0319 19:07:24.693752 5033 generic.go:334] "Generic (PLEG): container finished" podID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerID="b82f350c3a4202b07f7d5b4b2d5df0b8eba467486c9ac632f8a4cfa40683e3c8" exitCode=0 Mar 19 19:07:24 crc kubenswrapper[5033]: I0319 19:07:24.693847 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" event={"ID":"f7526813-a7dc-4074-a7ea-f791760e3cb0","Type":"ContainerDied","Data":"b82f350c3a4202b07f7d5b4b2d5df0b8eba467486c9ac632f8a4cfa40683e3c8"} Mar 19 19:07:25 crc kubenswrapper[5033]: I0319 19:07:25.707132 5033 generic.go:334] "Generic (PLEG): container finished" podID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerID="9e3abdd06a7e9f5d1461fd6c5fd206881946a1b8db93859ea23660993ccbaba6" exitCode=0 Mar 19 19:07:25 crc kubenswrapper[5033]: I0319 19:07:25.707182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" event={"ID":"f7526813-a7dc-4074-a7ea-f791760e3cb0","Type":"ContainerDied","Data":"9e3abdd06a7e9f5d1461fd6c5fd206881946a1b8db93859ea23660993ccbaba6"} Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.602659 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bk4w2"] Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603004 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-controller" containerID="cri-o://c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4" 
gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603061 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="nbdb" containerID="cri-o://5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603131 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="northd" containerID="cri-o://f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603164 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-node" containerID="cri-o://f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603185 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-acl-logging" containerID="cri-o://13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603206 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.603131 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="sbdb" containerID="cri-o://987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.644324 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovnkube-controller" containerID="cri-o://acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5" gracePeriod=30 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.717364 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5qfn_4a7b8904-0121-4d6c-849e-1ebfa3af0c61/kube-multus/0.log" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.717414 5033 generic.go:334] "Generic (PLEG): container finished" podID="4a7b8904-0121-4d6c-849e-1ebfa3af0c61" containerID="e905b7626a19de4e8ef2773614c8f4558ce6031d89279d6de3f9c845f392f98f" exitCode=2 Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.717498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5qfn" event={"ID":"4a7b8904-0121-4d6c-849e-1ebfa3af0c61","Type":"ContainerDied","Data":"e905b7626a19de4e8ef2773614c8f4558ce6031d89279d6de3f9c845f392f98f"} Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.718066 5033 scope.go:117] "RemoveContainer" containerID="e905b7626a19de4e8ef2773614c8f4558ce6031d89279d6de3f9c845f392f98f" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.896514 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.902044 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bk4w2_cb7906f1-92ba-45a7-9a54-82a77c8e3e66/ovn-acl-logging/0.log" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.902621 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bk4w2_cb7906f1-92ba-45a7-9a54-82a77c8e3e66/ovn-controller/0.log" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.902968 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.966835 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2rkj"] Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967083 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="pull" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967105 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="pull" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967117 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967126 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967141 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="nbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967149 5033 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="nbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967160 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="util" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967166 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="util" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967176 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967185 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967196 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovnkube-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967204 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovnkube-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967213 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-acl-logging" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967222 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-acl-logging" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967235 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="northd" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967242 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="northd" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967257 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="sbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967264 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="sbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967274 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="extract" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967282 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="extract" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967294 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kubecfg-setup" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967302 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kubecfg-setup" Mar 19 19:07:26 crc kubenswrapper[5033]: E0319 19:07:26.967311 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-node" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967318 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-node" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967422 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="sbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967462 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" 
containerName="ovn-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967471 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovn-acl-logging" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967481 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="nbdb" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967492 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-node" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967504 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967512 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="northd" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967521 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerName="ovnkube-controller" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.967533 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7526813-a7dc-4074-a7ea-f791760e3cb0" containerName="extract" Mar 19 19:07:26 crc kubenswrapper[5033]: I0319 19:07:26.969635 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.017932 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.017973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.017996 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018002 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018032 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). 
InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018057 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018058 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018090 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018075 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018276 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018383 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018673 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018714 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018745 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dqhf\" (UniqueName: \"kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf\") pod \"f7526813-a7dc-4074-a7ea-f791760e3cb0\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018765 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018778 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018783 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018801 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018785 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket" (OuterVolumeSpecName: "log-socket") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018810 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018832 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018851 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018858 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle\") pod \"f7526813-a7dc-4074-a7ea-f791760e3cb0\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018879 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018879 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash" (OuterVolumeSpecName: "host-slash") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018901 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log" (OuterVolumeSpecName: "node-log") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018912 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.018935 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019002 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util\") pod \"f7526813-a7dc-4074-a7ea-f791760e3cb0\" (UID: \"f7526813-a7dc-4074-a7ea-f791760e3cb0\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019185 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019288 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019372 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019405 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmnzc\" (UniqueName: \"kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019862 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019907 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.019943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config\") pod \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\" (UID: \"cb7906f1-92ba-45a7-9a54-82a77c8e3e66\") " Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020091 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-systemd-units\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020131 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-node-log\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020160 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-bin\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-systemd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-config\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020311 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-env-overrides\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020354 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-ovn\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020376 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-script-lib\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-etc-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020434 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b37811ed-2e89-4203-b39e-496c175d263b-ovn-node-metrics-cert\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020532 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-netns\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 
19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-slash\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020572 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020612 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-kubelet\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020632 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020656 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-var-lib-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkp56\" (UniqueName: \"kubernetes.io/projected/b37811ed-2e89-4203-b39e-496c175d263b-kube-api-access-mkp56\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020788 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-netd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020821 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-log-socket\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020915 5033 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 
crc kubenswrapper[5033]: I0319 19:07:27.020930 5033 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020941 5033 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020950 5033 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020958 5033 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020966 5033 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020974 5033 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020982 5033 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.020990 5033 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021000 5033 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021009 5033 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021017 5033 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021025 5033 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021033 5033 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021102 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021110 5033 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021119 5033 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.021006 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle" (OuterVolumeSpecName: "bundle") pod "f7526813-a7dc-4074-a7ea-f791760e3cb0" (UID: "f7526813-a7dc-4074-a7ea-f791760e3cb0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.024198 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf" (OuterVolumeSpecName: "kube-api-access-8dqhf") pod "f7526813-a7dc-4074-a7ea-f791760e3cb0" (UID: "f7526813-a7dc-4074-a7ea-f791760e3cb0"). InnerVolumeSpecName "kube-api-access-8dqhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.024888 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc" (OuterVolumeSpecName: "kube-api-access-qmnzc") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "kube-api-access-qmnzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.024931 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.029758 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util" (OuterVolumeSpecName: "util") pod "f7526813-a7dc-4074-a7ea-f791760e3cb0" (UID: "f7526813-a7dc-4074-a7ea-f791760e3cb0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.033908 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "cb7906f1-92ba-45a7-9a54-82a77c8e3e66" (UID: "cb7906f1-92ba-45a7-9a54-82a77c8e3e66"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.121925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-node-log\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122203 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-bin\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122291 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-bin\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122299 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-systemd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-node-log\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122392 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-config\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122423 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-env-overrides\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122462 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-ovn\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122487 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-script-lib\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-etc-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b37811ed-2e89-4203-b39e-496c175d263b-ovn-node-metrics-cert\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122559 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-netns\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122595 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-slash\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122652 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-kubelet\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-ovn\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122711 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-slash\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122670 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-etc-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122760 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-var-lib-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122759 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-kubelet\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-run-netns\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-var-lib-openvswitch\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122851 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2rkj\" (UID: 
\"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122886 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkp56\" (UniqueName: \"kubernetes.io/projected/b37811ed-2e89-4203-b39e-496c175d263b-kube-api-access-mkp56\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.122977 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-netd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-log-socket\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123069 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-config\") pod \"ovnkube-node-c2rkj\" (UID: 
\"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123089 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-systemd-units\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123110 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-log-socket\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123091 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-host-cni-netd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123119 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-systemd-units\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123225 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dqhf\" (UniqueName: \"kubernetes.io/projected/f7526813-a7dc-4074-a7ea-f791760e3cb0-kube-api-access-8dqhf\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123247 5033 reconciler_common.go:293] "Volume detached for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123265 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7526813-a7dc-4074-a7ea-f791760e3cb0-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123283 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmnzc\" (UniqueName: \"kubernetes.io/projected/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-kube-api-access-qmnzc\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123302 5033 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123318 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb7906f1-92ba-45a7-9a54-82a77c8e3e66-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123264 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-env-overrides\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.123471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b37811ed-2e89-4203-b39e-496c175d263b-run-systemd\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 
19:07:27.123849 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b37811ed-2e89-4203-b39e-496c175d263b-ovnkube-script-lib\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.127763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b37811ed-2e89-4203-b39e-496c175d263b-ovn-node-metrics-cert\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.153146 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkp56\" (UniqueName: \"kubernetes.io/projected/b37811ed-2e89-4203-b39e-496c175d263b-kube-api-access-mkp56\") pod \"ovnkube-node-c2rkj\" (UID: \"b37811ed-2e89-4203-b39e-496c175d263b\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.286109 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:27 crc kubenswrapper[5033]: W0319 19:07:27.307661 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37811ed_2e89_4203_b39e_496c175d263b.slice/crio-04cfb86d18adb926759a96db2da690b16ff430220eec8301ce4ba1b907caad81 WatchSource:0}: Error finding container 04cfb86d18adb926759a96db2da690b16ff430220eec8301ce4ba1b907caad81: Status 404 returned error can't find the container with id 04cfb86d18adb926759a96db2da690b16ff430220eec8301ce4ba1b907caad81 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.725470 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bk4w2_cb7906f1-92ba-45a7-9a54-82a77c8e3e66/ovn-acl-logging/0.log" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.726726 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bk4w2_cb7906f1-92ba-45a7-9a54-82a77c8e3e66/ovn-controller/0.log" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727080 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727108 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727115 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727123 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727130 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727137 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7" exitCode=0 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727144 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc" exitCode=143 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727151 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4" exitCode=143 Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727218 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727223 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727250 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727263 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727282 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727286 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727446 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727494 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727502 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727543 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727551 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727556 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727561 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727589 5033 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727595 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727600 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727605 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727610 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727627 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727634 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} Mar 19 
19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727639 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727645 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727649 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727654 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727659 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727665 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727670 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727677 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bk4w2" event={"ID":"cb7906f1-92ba-45a7-9a54-82a77c8e3e66","Type":"ContainerDied","Data":"ec319f58f8bcf43b271ba1d4458d441fa369e94ac625dd5efb142e3011d67080"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727684 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727690 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727695 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727701 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727706 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727714 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727719 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727724 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.727729 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.728915 5033 generic.go:334] "Generic (PLEG): container finished" podID="b37811ed-2e89-4203-b39e-496c175d263b" containerID="ad2f65bc45ad80f1406f96b0f6f583e18711905b5fa2f9da8237249e535fd2ac" exitCode=0
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.728956 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerDied","Data":"ad2f65bc45ad80f1406f96b0f6f583e18711905b5fa2f9da8237249e535fd2ac"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.728971 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"04cfb86d18adb926759a96db2da690b16ff430220eec8301ce4ba1b907caad81"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.731307 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5qfn_4a7b8904-0121-4d6c-849e-1ebfa3af0c61/kube-multus/0.log"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.731576 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5qfn" event={"ID":"4a7b8904-0121-4d6c-849e-1ebfa3af0c61","Type":"ContainerStarted","Data":"afba5acfcd7dc2b2fe14fe63a00bcd6c81cea2f960f4f69eeb147680e0a32ce7"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.737708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8" event={"ID":"f7526813-a7dc-4074-a7ea-f791760e3cb0","Type":"ContainerDied","Data":"25a5abcb4471bda51f7aac756dabf9c84f7743db88b792d8559cb312af9ba19f"}
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.737744 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a5abcb4471bda51f7aac756dabf9c84f7743db88b792d8559cb312af9ba19f"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.737808 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.749013 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.767097 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.789959 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bk4w2"]
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.797558 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bk4w2"]
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.805730 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.819658 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.831023 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.841940 5033 scope.go:117] "RemoveContainer" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.855294 5033 scope.go:117] "RemoveContainer" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.869486 5033 scope.go:117] "RemoveContainer" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.887395 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.887925 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.887974 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} err="failed to get container status \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.888004 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.888277 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.888313 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} err="failed to get container status \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": rpc error: code = NotFound desc = could not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.888342 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.888781 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.888807 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} err="failed to get container status \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.888823 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.889172 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889201 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} err="failed to get container status \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889222 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.889491 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889516 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} err="failed to get container status \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889529 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.889856 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889921 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} err="failed to get container status \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.889962 5033 scope.go:117] "RemoveContainer" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.890316 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": container with ID starting with 13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc not found: ID does not exist" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.890347 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} err="failed to get container status \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": rpc error: code = NotFound desc = could not find container \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": container with ID starting with 13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.890366 5033 scope.go:117] "RemoveContainer" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.890635 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": container with ID starting with c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4 not found: ID does not exist" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.890660 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} err="failed to get container status \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": rpc error: code = NotFound desc = could not find container \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": container with ID starting with c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.890678 5033 scope.go:117] "RemoveContainer" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: E0319 19:07:27.891040 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": container with ID starting with b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085 not found: ID does not exist" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891064 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} err="failed to get container status \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": rpc error: code = NotFound desc = could not find container \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": container with ID starting with b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891089 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891440 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} err="failed to get container status \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891505 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891798 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} err="failed to get container status \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": rpc error: code = NotFound desc = could not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.891819 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892118 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} err="failed to get container status \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892145 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892572 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} err="failed to get container status \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892619 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892937 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} err="failed to get container status \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.892964 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.893341 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} err="failed to get container status \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.893361 5033 scope.go:117] "RemoveContainer" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.893752 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} err="failed to get container status \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": rpc error: code = NotFound desc = could not find container \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": container with ID starting with 13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.893788 5033 scope.go:117] "RemoveContainer" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.894222 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} err="failed to get container status \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": rpc error: code = NotFound desc = could not find container \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": container with ID starting with c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.894247 5033 scope.go:117] "RemoveContainer" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.894576 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} err="failed to get container status \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": rpc error: code = NotFound desc = could not find container \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": container with ID starting with b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.894595 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.894995 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} err="failed to get container status \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.895037 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.895323 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} err="failed to get container status \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": rpc error: code = NotFound desc = could not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.895363 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.895658 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} err="failed to get container status \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.895684 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.896092 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} err="failed to get container status \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.896108 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.896508 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} err="failed to get container status \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.896537 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.897001 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} err="failed to get container status \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.897511 5033 scope.go:117] "RemoveContainer" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.897996 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} err="failed to get container status \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": rpc error: code = NotFound desc = could not find container \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": container with ID starting with 13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.898023 5033 scope.go:117] "RemoveContainer" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.898289 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} err="failed to get container status \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": rpc error: code = NotFound desc = could not find container \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": container with ID starting with c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.898310 5033 scope.go:117] "RemoveContainer" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.898671 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} err="failed to get container status \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": rpc error: code = NotFound desc = could not find container \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": container with ID starting with b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.898698 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.899184 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} err="failed to get container status \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.899227 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.899639 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} err="failed to get container status \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": rpc error: code = NotFound desc = could not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.899663 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.900000 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} err="failed to get container status \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.900041 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.900693 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} err="failed to get container status \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.900720 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.901265 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} err="failed to get container status \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.901291 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.901643 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} err="failed to get container status \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.901695 5033 scope.go:117] "RemoveContainer" containerID="13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902049 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc"} err="failed to get container status \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": rpc error: code = NotFound desc = could not find container \"13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc\": container with ID starting with 13d38ff440c5bdd32cb2179464f3782a7ead81b8ab2e2ebdefc0fb0ecfda4ebc not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902073 5033 scope.go:117] "RemoveContainer" containerID="c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902357 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4"} err="failed to get container status \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": rpc error: code = NotFound desc = could not find container \"c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4\": container with ID starting with c7851cc9d90d8feabc32c8301c1739131227c0634cb1ca94cb78013bed5dc6a4 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902381 5033 scope.go:117] "RemoveContainer" containerID="b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902780 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085"} err="failed to get container status \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": rpc error: code = NotFound desc = could not find container \"b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085\": container with ID starting with b531ffd7b0692f6aee67745217162db0983dd719c95db06b899489599f254085 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.902802 5033 scope.go:117] "RemoveContainer" containerID="acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.903235 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5"} err="failed to get container status \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": rpc error: code = NotFound desc = could not find container \"acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5\": container with ID starting with acf501b0340d3e4228ebbcc1bdf0e3f4e78963c91796281bb8bce769e3fb09f5 not found: ID does not exist"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.903298 5033 scope.go:117] "RemoveContainer" containerID="987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"
Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.903745 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564"} err="failed to get container status \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": rpc error: code = NotFound desc = could
not find container \"987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564\": container with ID starting with 987ece16754169aa5e1c0d1d39ac0aeaaa0853a78d133efb407bd76a854b3564 not found: ID does not exist" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.903769 5033 scope.go:117] "RemoveContainer" containerID="5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.904103 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6"} err="failed to get container status \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": rpc error: code = NotFound desc = could not find container \"5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6\": container with ID starting with 5efdb734a2cd6b05c1cc3bd3f87c33ba9e38f6cb6fe2ec60b9c76249f7bd16d6 not found: ID does not exist" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.904124 5033 scope.go:117] "RemoveContainer" containerID="f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.904394 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d"} err="failed to get container status \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": rpc error: code = NotFound desc = could not find container \"f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d\": container with ID starting with f7037940155da20fc0b75104415972ea520d77bdef8c34f1143b58298f88c80d not found: ID does not exist" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.904413 5033 scope.go:117] "RemoveContainer" containerID="1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 
19:07:27.904751 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6"} err="failed to get container status \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": rpc error: code = NotFound desc = could not find container \"1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6\": container with ID starting with 1f6d8d07d782e528c30e13fb8f4d382df641d85ea0cff4576d62107d9770fee6 not found: ID does not exist" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.904778 5033 scope.go:117] "RemoveContainer" containerID="f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7" Mar 19 19:07:27 crc kubenswrapper[5033]: I0319 19:07:27.905072 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7"} err="failed to get container status \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": rpc error: code = NotFound desc = could not find container \"f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7\": container with ID starting with f233763fa48999d064aa534f8225fab704498e00ff31745067c4fdce5cc3fcc7 not found: ID does not exist" Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.627493 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7906f1-92ba-45a7-9a54-82a77c8e3e66" path="/var/lib/kubelet/pods/cb7906f1-92ba-45a7-9a54-82a77c8e3e66/volumes" Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"a3b5ea66ea2d9dc118f5831850a770addac5980b0175e06c8b279d3951a10aba"} Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747689 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"2c1fccc2e3edb4d3bd4927b6c22146cb0e523f1388b7f211fe7df728cd9e92a0"} Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747717 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"d2439c12cb36bf31811a322b1832a00a11ddb1f770b571f44b2272b88d85aeaa"} Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"f7c34f8130ae42a245a19b6946b9c8f06eb59ad7aaf6a84869677efb342a58d3"} Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747765 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"4cdbbac25dc907ac2d51d8a4ae5626bb7206015fd32255255f708bf2a625a718"} Mar 19 19:07:28 crc kubenswrapper[5033]: I0319 19:07:28.747789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"70d8713d6374ca967133894905dac8c67b1e2a5aab25305340874ecaa7bcffb3"} Mar 19 19:07:30 crc kubenswrapper[5033]: I0319 19:07:30.763275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"5a8aaf1fc02c472bb2e7c3005785ff618b5678a1d03433c309cff7482bd08911"} Mar 19 19:07:31 crc kubenswrapper[5033]: I0319 19:07:31.053032 5033 scope.go:117] "RemoveContainer" containerID="e9064855d760b0feef7575b1fc9157300ed4eeec6cd893746000319a93786762" Mar 19 
19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.780514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" event={"ID":"b37811ed-2e89-4203-b39e-496c175d263b","Type":"ContainerStarted","Data":"acefd4c415fd576813afeb0fd9091fc98d224a3fd5657156395883133266c34e"} Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.781059 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.781137 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.781199 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.820920 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.823297 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" podStartSLOduration=7.823280892 podStartE2EDuration="7.823280892s" podCreationTimestamp="2026-03-19 19:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:07:33.82074905 +0000 UTC m=+663.925778899" watchObservedRunningTime="2026-03-19 19:07:33.823280892 +0000 UTC m=+663.928310741" Mar 19 19:07:33 crc kubenswrapper[5033]: I0319 19:07:33.832154 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.676536 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj"] Mar 19 
19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.677860 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.679819 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6jls5" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.679826 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.680409 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.708459 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj"] Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.862965 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v27w\" (UniqueName: \"kubernetes.io/projected/2f90a5c2-7618-4090-b63c-6d40664ab26e-kube-api-access-6v27w\") pod \"obo-prometheus-operator-8ff7d675-hr4xj\" (UID: \"2f90a5c2-7618-4090-b63c-6d40664ab26e\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.944505 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj"] Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.945085 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.946986 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-nxrrz" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.950787 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.951900 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5"] Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.952590 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.957334 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj"] Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.961084 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5"] Mar 19 19:07:38 crc kubenswrapper[5033]: I0319 19:07:38.964127 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v27w\" (UniqueName: \"kubernetes.io/projected/2f90a5c2-7618-4090-b63c-6d40664ab26e-kube-api-access-6v27w\") pod \"obo-prometheus-operator-8ff7d675-hr4xj\" (UID: \"2f90a5c2-7618-4090-b63c-6d40664ab26e\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:38.994317 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v27w\" (UniqueName: 
\"kubernetes.io/projected/2f90a5c2-7618-4090-b63c-6d40664ab26e-kube-api-access-6v27w\") pod \"obo-prometheus-operator-8ff7d675-hr4xj\" (UID: \"2f90a5c2-7618-4090-b63c-6d40664ab26e\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:38.999799 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.065364 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.065429 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.065477 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.065564 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.170215 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.170266 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.170304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.170340 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.177266 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.198240 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.208919 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/88b04cc7-0103-4c0a-bd35-421e81888064-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj\" (UID: \"88b04cc7-0103-4c0a-bd35-421e81888064\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.217934 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bff649f-6d36-42eb-8419-aebbe076b40c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5\" (UID: \"0bff649f-6d36-42eb-8419-aebbe076b40c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.266145 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.277908 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.347079 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj"] Mar 19 19:07:39 crc kubenswrapper[5033]: W0319 19:07:39.451607 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f90a5c2_7618_4090_b63c_6d40664ab26e.slice/crio-b87a5b773c4dcaa73296d26c8f0dab5523d34970066d0a79cc060c11a0733423 WatchSource:0}: Error finding container b87a5b773c4dcaa73296d26c8f0dab5523d34970066d0a79cc060c11a0733423: Status 404 returned error can't find the container with id b87a5b773c4dcaa73296d26c8f0dab5523d34970066d0a79cc060c11a0733423 Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.491108 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-v94xx"] Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.491904 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.495795 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.495870 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-7pzdj" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.512591 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-v94xx"] Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.577823 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsc9\" (UniqueName: \"kubernetes.io/projected/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-kube-api-access-wdsc9\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.577965 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.605016 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj"] Mar 19 19:07:39 crc kubenswrapper[5033]: W0319 19:07:39.615416 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b04cc7_0103_4c0a_bd35_421e81888064.slice/crio-d851440fae8e8b015a35ee5a31ff28075db3690aac195b54411b4ae47ab563d0 WatchSource:0}: Error finding container d851440fae8e8b015a35ee5a31ff28075db3690aac195b54411b4ae47ab563d0: Status 404 returned error can't find the container with id d851440fae8e8b015a35ee5a31ff28075db3690aac195b54411b4ae47ab563d0 Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.622921 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5"] Mar 19 19:07:39 crc kubenswrapper[5033]: W0319 19:07:39.634851 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bff649f_6d36_42eb_8419_aebbe076b40c.slice/crio-a4abbcb615916907378b09ec71b2a277c9404dd47d9849d84421785e5791218d WatchSource:0}: Error finding container a4abbcb615916907378b09ec71b2a277c9404dd47d9849d84421785e5791218d: Status 404 returned error can't find the container with id a4abbcb615916907378b09ec71b2a277c9404dd47d9849d84421785e5791218d Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.678642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsc9\" (UniqueName: \"kubernetes.io/projected/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-kube-api-access-wdsc9\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.678745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.684072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.700293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsc9\" (UniqueName: \"kubernetes.io/projected/92efd1e2-4b4b-48b0-a991-1c3cfe62eef3-kube-api-access-wdsc9\") pod \"observability-operator-6dd7dd855f-v94xx\" (UID: \"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3\") " pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.795898 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-558f99686-58c2z"] Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.796571 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.802691 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.803096 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7xh4n" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.810557 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" event={"ID":"2f90a5c2-7618-4090-b63c-6d40664ab26e","Type":"ContainerStarted","Data":"b87a5b773c4dcaa73296d26c8f0dab5523d34970066d0a79cc060c11a0733423"} Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.812321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" event={"ID":"88b04cc7-0103-4c0a-bd35-421e81888064","Type":"ContainerStarted","Data":"d851440fae8e8b015a35ee5a31ff28075db3690aac195b54411b4ae47ab563d0"} Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.815509 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" event={"ID":"0bff649f-6d36-42eb-8419-aebbe076b40c","Type":"ContainerStarted","Data":"a4abbcb615916907378b09ec71b2a277c9404dd47d9849d84421785e5791218d"} Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.830043 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.856500 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-558f99686-58c2z"] Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.879958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7g92\" (UniqueName: \"kubernetes.io/projected/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-kube-api-access-z7g92\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.880041 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-openshift-service-ca\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.880063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-webhook-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.880086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-apiservice-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 
crc kubenswrapper[5033]: I0319 19:07:39.991046 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7g92\" (UniqueName: \"kubernetes.io/projected/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-kube-api-access-z7g92\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.991136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-openshift-service-ca\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.991155 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-webhook-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.991177 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-apiservice-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:39 crc kubenswrapper[5033]: I0319 19:07:39.996745 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-openshift-service-ca\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " 
pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.002041 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-webhook-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.003074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-apiservice-cert\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.047667 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7g92\" (UniqueName: \"kubernetes.io/projected/5f1646e5-4c53-4c2f-93f0-b32741aa44ae-kube-api-access-z7g92\") pod \"perses-operator-558f99686-58c2z\" (UID: \"5f1646e5-4c53-4c2f-93f0-b32741aa44ae\") " pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.116917 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.150671 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-v94xx"] Mar 19 19:07:40 crc kubenswrapper[5033]: W0319 19:07:40.169306 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92efd1e2_4b4b_48b0_a991_1c3cfe62eef3.slice/crio-d265340776a755a76e88bfb612af563f4b52267642d158b8dfdd642358d7d287 WatchSource:0}: Error finding container d265340776a755a76e88bfb612af563f4b52267642d158b8dfdd642358d7d287: Status 404 returned error can't find the container with id d265340776a755a76e88bfb612af563f4b52267642d158b8dfdd642358d7d287 Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.343358 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-558f99686-58c2z"] Mar 19 19:07:40 crc kubenswrapper[5033]: W0319 19:07:40.348491 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f1646e5_4c53_4c2f_93f0_b32741aa44ae.slice/crio-f17da22a4588c6353bf42fab5ddba24b5b68c56ffd8012bf6194680a12effd60 WatchSource:0}: Error finding container f17da22a4588c6353bf42fab5ddba24b5b68c56ffd8012bf6194680a12effd60: Status 404 returned error can't find the container with id f17da22a4588c6353bf42fab5ddba24b5b68c56ffd8012bf6194680a12effd60 Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.759117 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.759182 5033 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.821916 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-558f99686-58c2z" event={"ID":"5f1646e5-4c53-4c2f-93f0-b32741aa44ae","Type":"ContainerStarted","Data":"f17da22a4588c6353bf42fab5ddba24b5b68c56ffd8012bf6194680a12effd60"} Mar 19 19:07:40 crc kubenswrapper[5033]: I0319 19:07:40.822644 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" event={"ID":"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3","Type":"ContainerStarted","Data":"d265340776a755a76e88bfb612af563f4b52267642d158b8dfdd642358d7d287"} Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.876407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" event={"ID":"0bff649f-6d36-42eb-8419-aebbe076b40c","Type":"ContainerStarted","Data":"f765b068626e3cf12ed429f88c3dcad9cd6edcce6c51a38b87be4adf168906d2"} Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.877496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" event={"ID":"88b04cc7-0103-4c0a-bd35-421e81888064","Type":"ContainerStarted","Data":"711abcd2befb26a0246113f0655f8bbd10a0740a0472e6dd49887e2ad1ac5aa7"} Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.879595 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" event={"ID":"2f90a5c2-7618-4090-b63c-6d40664ab26e","Type":"ContainerStarted","Data":"717c57211b4f3d140afdaffb8c1f820c651e5e658f89ce1d5a8abf2d89996ae7"} Mar 19 19:07:49 
crc kubenswrapper[5033]: I0319 19:07:49.880513 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-558f99686-58c2z" event={"ID":"5f1646e5-4c53-4c2f-93f0-b32741aa44ae","Type":"ContainerStarted","Data":"cbef3c79db8385eff30f1c5044350538f4b0bb2ee442d811d912525bd2717a21"} Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.880776 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.881686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" event={"ID":"92efd1e2-4b4b-48b0-a991-1c3cfe62eef3","Type":"ContainerStarted","Data":"92274e79f31289701077b068b64013c879a8ce66dce21d7e6e624e60debb151a"} Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.881982 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.893346 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.911387 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5" podStartSLOduration=2.0624060379999998 podStartE2EDuration="11.911365312s" podCreationTimestamp="2026-03-19 19:07:38 +0000 UTC" firstStartedPulling="2026-03-19 19:07:39.636708151 +0000 UTC m=+669.741738000" lastFinishedPulling="2026-03-19 19:07:49.485667425 +0000 UTC m=+679.590697274" observedRunningTime="2026-03-19 19:07:49.901363289 +0000 UTC m=+680.006393148" watchObservedRunningTime="2026-03-19 19:07:49.911365312 +0000 UTC m=+680.016395161" Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.931322 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-v94xx" podStartSLOduration=1.794354649 podStartE2EDuration="10.931306057s" podCreationTimestamp="2026-03-19 19:07:39 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.171268791 +0000 UTC m=+670.276298640" lastFinishedPulling="2026-03-19 19:07:49.308220199 +0000 UTC m=+679.413250048" observedRunningTime="2026-03-19 19:07:49.929183437 +0000 UTC m=+680.034213296" watchObservedRunningTime="2026-03-19 19:07:49.931306057 +0000 UTC m=+680.036335906" Mar 19 19:07:49 crc kubenswrapper[5033]: I0319 19:07:49.968928 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-558f99686-58c2z" podStartSLOduration=2.039381278 podStartE2EDuration="10.968907552s" podCreationTimestamp="2026-03-19 19:07:39 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.350995131 +0000 UTC m=+670.456024990" lastFinishedPulling="2026-03-19 19:07:49.280521415 +0000 UTC m=+679.385551264" observedRunningTime="2026-03-19 19:07:49.960704039 +0000 UTC m=+680.065733888" watchObservedRunningTime="2026-03-19 19:07:49.968907552 +0000 UTC m=+680.073937401" Mar 19 19:07:50 crc kubenswrapper[5033]: I0319 19:07:50.008054 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj" podStartSLOduration=2.375896107 podStartE2EDuration="12.00803299s" podCreationTimestamp="2026-03-19 19:07:38 +0000 UTC" firstStartedPulling="2026-03-19 19:07:39.621634814 +0000 UTC m=+669.726664663" lastFinishedPulling="2026-03-19 19:07:49.253771697 +0000 UTC m=+679.358801546" observedRunningTime="2026-03-19 19:07:50.004895151 +0000 UTC m=+680.109925000" watchObservedRunningTime="2026-03-19 19:07:50.00803299 +0000 UTC m=+680.113062849" Mar 19 19:07:50 crc kubenswrapper[5033]: I0319 19:07:50.009415 5033 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-hr4xj" podStartSLOduration=2.188627053 podStartE2EDuration="12.009406569s" podCreationTimestamp="2026-03-19 19:07:38 +0000 UTC" firstStartedPulling="2026-03-19 19:07:39.453047469 +0000 UTC m=+669.558077318" lastFinishedPulling="2026-03-19 19:07:49.273826985 +0000 UTC m=+679.378856834" observedRunningTime="2026-03-19 19:07:49.989055992 +0000 UTC m=+680.094085871" watchObservedRunningTime="2026-03-19 19:07:50.009406569 +0000 UTC m=+680.114436418" Mar 19 19:07:57 crc kubenswrapper[5033]: I0319 19:07:57.313060 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2rkj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.104551 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hggdg"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.105411 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.107998 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.108048 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.110346 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sxrsb" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.114343 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tqbxh"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.115016 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tqbxh" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.118709 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-cpsxn" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.141752 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tqbxh"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.145548 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-9tjpj"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.146383 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.147793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-g69kv" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.152133 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-9tjpj"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.171043 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hggdg"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.243296 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhs8p\" (UniqueName: \"kubernetes.io/projected/4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd-kube-api-access-fhs8p\") pod \"cert-manager-858654f9db-tqbxh\" (UID: \"4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd\") " pod="cert-manager/cert-manager-858654f9db-tqbxh" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.243341 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbm6\" (UniqueName: 
\"kubernetes.io/projected/9a693b77-c591-4899-9aba-6f674eac5601-kube-api-access-7gbm6\") pod \"cert-manager-cainjector-cf98fcc89-hggdg\" (UID: \"9a693b77-c591-4899-9aba-6f674eac5601\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.243388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm822\" (UniqueName: \"kubernetes.io/projected/13744f00-1ee4-48fb-a839-d8bb7e4d7a7b-kube-api-access-wm822\") pod \"cert-manager-webhook-687f57d79b-9tjpj\" (UID: \"13744f00-1ee4-48fb-a839-d8bb7e4d7a7b\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.344211 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhs8p\" (UniqueName: \"kubernetes.io/projected/4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd-kube-api-access-fhs8p\") pod \"cert-manager-858654f9db-tqbxh\" (UID: \"4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd\") " pod="cert-manager/cert-manager-858654f9db-tqbxh" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.344258 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbm6\" (UniqueName: \"kubernetes.io/projected/9a693b77-c591-4899-9aba-6f674eac5601-kube-api-access-7gbm6\") pod \"cert-manager-cainjector-cf98fcc89-hggdg\" (UID: \"9a693b77-c591-4899-9aba-6f674eac5601\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.344304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm822\" (UniqueName: \"kubernetes.io/projected/13744f00-1ee4-48fb-a839-d8bb7e4d7a7b-kube-api-access-wm822\") pod \"cert-manager-webhook-687f57d79b-9tjpj\" (UID: \"13744f00-1ee4-48fb-a839-d8bb7e4d7a7b\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 
19:07:59.361739 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm822\" (UniqueName: \"kubernetes.io/projected/13744f00-1ee4-48fb-a839-d8bb7e4d7a7b-kube-api-access-wm822\") pod \"cert-manager-webhook-687f57d79b-9tjpj\" (UID: \"13744f00-1ee4-48fb-a839-d8bb7e4d7a7b\") " pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.362359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbm6\" (UniqueName: \"kubernetes.io/projected/9a693b77-c591-4899-9aba-6f674eac5601-kube-api-access-7gbm6\") pod \"cert-manager-cainjector-cf98fcc89-hggdg\" (UID: \"9a693b77-c591-4899-9aba-6f674eac5601\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.369416 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhs8p\" (UniqueName: \"kubernetes.io/projected/4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd-kube-api-access-fhs8p\") pod \"cert-manager-858654f9db-tqbxh\" (UID: \"4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd\") " pod="cert-manager/cert-manager-858654f9db-tqbxh" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.425858 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.441050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tqbxh" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.458033 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.895141 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-hggdg"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.931381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" event={"ID":"9a693b77-c591-4899-9aba-6f674eac5601","Type":"ContainerStarted","Data":"e84f41f1e061a618bf21b5e79d05ff9c77c566423c33a5aee70b43605efcec6d"} Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.968255 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tqbxh"] Mar 19 19:07:59 crc kubenswrapper[5033]: I0319 19:07:59.971650 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-9tjpj"] Mar 19 19:07:59 crc kubenswrapper[5033]: W0319 19:07:59.973682 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13744f00_1ee4_48fb_a839_d8bb7e4d7a7b.slice/crio-2e29da20e070e1ec234230bf81500e1c37ca86b6e95f7e3bdb107a56f9f2ecb7 WatchSource:0}: Error finding container 2e29da20e070e1ec234230bf81500e1c37ca86b6e95f7e3bdb107a56f9f2ecb7: Status 404 returned error can't find the container with id 2e29da20e070e1ec234230bf81500e1c37ca86b6e95f7e3bdb107a56f9f2ecb7 Mar 19 19:07:59 crc kubenswrapper[5033]: W0319 19:07:59.974483 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dd5d60d_d4a3_4c4a_a194_b5c39e7bc4fd.slice/crio-5e8aafc1a39708f684f8d828ff9e09c425faa0d557c0d8693fec25809d35057f WatchSource:0}: Error finding container 5e8aafc1a39708f684f8d828ff9e09c425faa0d557c0d8693fec25809d35057f: Status 404 returned error can't find the container with id 
5e8aafc1a39708f684f8d828ff9e09c425faa0d557c0d8693fec25809d35057f Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.120263 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-558f99686-58c2z" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.126745 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565788-xl8gh"] Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.127474 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.128682 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.129271 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.130014 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.142122 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-xl8gh"] Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.255544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rljlr\" (UniqueName: \"kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr\") pod \"auto-csr-approver-29565788-xl8gh\" (UID: \"e9bea9e5-2621-4992-81ad-63612a4d5460\") " pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.356284 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rljlr\" (UniqueName: 
\"kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr\") pod \"auto-csr-approver-29565788-xl8gh\" (UID: \"e9bea9e5-2621-4992-81ad-63612a4d5460\") " pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.383309 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rljlr\" (UniqueName: \"kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr\") pod \"auto-csr-approver-29565788-xl8gh\" (UID: \"e9bea9e5-2621-4992-81ad-63612a4d5460\") " pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.444514 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.680607 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-xl8gh"] Mar 19 19:08:00 crc kubenswrapper[5033]: I0319 19:08:00.997808 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tqbxh" event={"ID":"4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd","Type":"ContainerStarted","Data":"5e8aafc1a39708f684f8d828ff9e09c425faa0d557c0d8693fec25809d35057f"} Mar 19 19:08:01 crc kubenswrapper[5033]: I0319 19:08:01.011751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" event={"ID":"e9bea9e5-2621-4992-81ad-63612a4d5460","Type":"ContainerStarted","Data":"01120e071ea7d5691a7ca6e2482a44cd340e53484e72a5182b69d1344f462f98"} Mar 19 19:08:01 crc kubenswrapper[5033]: I0319 19:08:01.019084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" event={"ID":"13744f00-1ee4-48fb-a839-d8bb7e4d7a7b","Type":"ContainerStarted","Data":"2e29da20e070e1ec234230bf81500e1c37ca86b6e95f7e3bdb107a56f9f2ecb7"} Mar 19 19:08:03 
crc kubenswrapper[5033]: I0319 19:08:03.037279 5033 generic.go:334] "Generic (PLEG): container finished" podID="e9bea9e5-2621-4992-81ad-63612a4d5460" containerID="1edfa7a204497bf4920bb9aec8872ccdac51c6edf2d1bd3167d3f9b69c18edee" exitCode=0 Mar 19 19:08:03 crc kubenswrapper[5033]: I0319 19:08:03.037401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" event={"ID":"e9bea9e5-2621-4992-81ad-63612a4d5460","Type":"ContainerDied","Data":"1edfa7a204497bf4920bb9aec8872ccdac51c6edf2d1bd3167d3f9b69c18edee"} Mar 19 19:08:04 crc kubenswrapper[5033]: I0319 19:08:04.945423 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.044561 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rljlr\" (UniqueName: \"kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr\") pod \"e9bea9e5-2621-4992-81ad-63612a4d5460\" (UID: \"e9bea9e5-2621-4992-81ad-63612a4d5460\") " Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.046991 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" event={"ID":"e9bea9e5-2621-4992-81ad-63612a4d5460","Type":"ContainerDied","Data":"01120e071ea7d5691a7ca6e2482a44cd340e53484e72a5182b69d1344f462f98"} Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.047022 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01120e071ea7d5691a7ca6e2482a44cd340e53484e72a5182b69d1344f462f98" Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.047070 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-xl8gh" Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.050394 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr" (OuterVolumeSpecName: "kube-api-access-rljlr") pod "e9bea9e5-2621-4992-81ad-63612a4d5460" (UID: "e9bea9e5-2621-4992-81ad-63612a4d5460"). InnerVolumeSpecName "kube-api-access-rljlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:08:05 crc kubenswrapper[5033]: I0319 19:08:05.146311 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rljlr\" (UniqueName: \"kubernetes.io/projected/e9bea9e5-2621-4992-81ad-63612a4d5460-kube-api-access-rljlr\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.011948 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-s7btv"] Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.033355 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-s7btv"] Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.052654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" event={"ID":"9a693b77-c591-4899-9aba-6f674eac5601","Type":"ContainerStarted","Data":"dec4f0ec6277a7f5bd886cb0c60f5a2bcf8c32dd25583669b0a3194f8d0fe0c1"} Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.054240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tqbxh" event={"ID":"4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd","Type":"ContainerStarted","Data":"ffa01d9641ac3d2434d6e2bec53d5af63942da69bc6ec37b66282a105b831155"} Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.055538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" 
event={"ID":"13744f00-1ee4-48fb-a839-d8bb7e4d7a7b","Type":"ContainerStarted","Data":"acaffff306079cb730ce0ea151f003ca00e7532508a6d8a1f59d8a8a5a5f78db"} Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.055862 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.065742 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-hggdg" podStartSLOduration=1.869928603 podStartE2EDuration="7.065722389s" podCreationTimestamp="2026-03-19 19:07:59 +0000 UTC" firstStartedPulling="2026-03-19 19:07:59.903403398 +0000 UTC m=+690.008433247" lastFinishedPulling="2026-03-19 19:08:05.099197184 +0000 UTC m=+695.204227033" observedRunningTime="2026-03-19 19:08:06.06541573 +0000 UTC m=+696.170445579" watchObservedRunningTime="2026-03-19 19:08:06.065722389 +0000 UTC m=+696.170752238" Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.116759 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" podStartSLOduration=1.994259153 podStartE2EDuration="7.116745134s" podCreationTimestamp="2026-03-19 19:07:59 +0000 UTC" firstStartedPulling="2026-03-19 19:07:59.975979003 +0000 UTC m=+690.081008852" lastFinishedPulling="2026-03-19 19:08:05.098445393 +0000 UTC m=+695.203494833" observedRunningTime="2026-03-19 19:08:06.091741106 +0000 UTC m=+696.196770965" watchObservedRunningTime="2026-03-19 19:08:06.116745134 +0000 UTC m=+696.221774983" Mar 19 19:08:06 crc kubenswrapper[5033]: I0319 19:08:06.626145 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f376a12-8710-4832-9d22-014c54f33dfd" path="/var/lib/kubelet/pods/8f376a12-8710-4832-9d22-014c54f33dfd/volumes" Mar 19 19:08:10 crc kubenswrapper[5033]: I0319 19:08:10.758481 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:08:10 crc kubenswrapper[5033]: I0319 19:08:10.759180 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:08:10 crc kubenswrapper[5033]: I0319 19:08:10.759256 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:08:10 crc kubenswrapper[5033]: I0319 19:08:10.760062 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:08:10 crc kubenswrapper[5033]: I0319 19:08:10.760160 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc" gracePeriod=600 Mar 19 19:08:11 crc kubenswrapper[5033]: I0319 19:08:11.100203 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc" exitCode=0 Mar 19 19:08:11 crc kubenswrapper[5033]: I0319 19:08:11.100444 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc"} Mar 19 19:08:11 crc kubenswrapper[5033]: I0319 19:08:11.100504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8"} Mar 19 19:08:11 crc kubenswrapper[5033]: I0319 19:08:11.100523 5033 scope.go:117] "RemoveContainer" containerID="8b6d2f60acc26def0ca852563cda7c05af8090f9b045e147bf602317d8c7a4b3" Mar 19 19:08:11 crc kubenswrapper[5033]: I0319 19:08:11.118017 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tqbxh" podStartSLOduration=6.93765087 podStartE2EDuration="12.117994739s" podCreationTimestamp="2026-03-19 19:07:59 +0000 UTC" firstStartedPulling="2026-03-19 19:07:59.979922735 +0000 UTC m=+690.084952584" lastFinishedPulling="2026-03-19 19:08:05.160266594 +0000 UTC m=+695.265296453" observedRunningTime="2026-03-19 19:08:06.124518174 +0000 UTC m=+696.229548023" watchObservedRunningTime="2026-03-19 19:08:11.117994739 +0000 UTC m=+701.223024588" Mar 19 19:08:14 crc kubenswrapper[5033]: I0319 19:08:14.484371 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-9tjpj" Mar 19 19:08:31 crc kubenswrapper[5033]: I0319 19:08:31.108901 5033 scope.go:117] "RemoveContainer" containerID="9b24b8d1b2d10a3ef83856d75f1db6cb89c45753c951b265f963db706c337903" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.298230 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb"] Mar 19 19:08:43 crc 
kubenswrapper[5033]: E0319 19:08:43.299842 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bea9e5-2621-4992-81ad-63612a4d5460" containerName="oc" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.299917 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bea9e5-2621-4992-81ad-63612a4d5460" containerName="oc" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.300087 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bea9e5-2621-4992-81ad-63612a4d5460" containerName="oc" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.300913 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.303074 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.311796 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb"] Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.344465 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.344724 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: 
\"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.344887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94vt\" (UniqueName: \"kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.446121 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.446243 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.446321 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94vt\" (UniqueName: \"kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " 
pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.448943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.449191 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.464602 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94vt\" (UniqueName: \"kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.627503 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.842954 5033 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 19:08:43 crc kubenswrapper[5033]: I0319 19:08:43.928001 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb"] Mar 19 19:08:44 crc kubenswrapper[5033]: I0319 19:08:44.333329 5033 generic.go:334] "Generic (PLEG): container finished" podID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerID="69255b9c3abb802341e532535d9cf4b586fd196657d2dc7d3c56d20ac7fc521e" exitCode=0 Mar 19 19:08:44 crc kubenswrapper[5033]: I0319 19:08:44.333420 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" event={"ID":"3470cf1f-22c6-4e0b-b298-7500f269fba3","Type":"ContainerDied","Data":"69255b9c3abb802341e532535d9cf4b586fd196657d2dc7d3c56d20ac7fc521e"} Mar 19 19:08:44 crc kubenswrapper[5033]: I0319 19:08:44.333652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" event={"ID":"3470cf1f-22c6-4e0b-b298-7500f269fba3","Type":"ContainerStarted","Data":"c18fa2cb27bb797605c5ff078f9473e03fdc998c7771bb1d21ad5fae062cd5ba"} Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.640287 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.642767 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.652580 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.781919 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zst\" (UniqueName: \"kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.781957 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.782003 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.867490 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.868501 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.871575 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.873221 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.882479 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.882951 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zst\" (UniqueName: \"kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.882994 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.883081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.883580 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities\") pod \"redhat-operators-ftwvs\" (UID: 
\"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.883645 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.908620 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zst\" (UniqueName: \"kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst\") pod \"redhat-operators-ftwvs\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.973247 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.984941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:45 crc kubenswrapper[5033]: I0319 19:08:45.985006 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4xx\" (UniqueName: \"kubernetes.io/projected/55c757d5-7d33-4ece-867a-899d5a4503a4-kube-api-access-7w4xx\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.086123 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.086529 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4xx\" (UniqueName: \"kubernetes.io/projected/55c757d5-7d33-4ece-867a-899d5a4503a4-kube-api-access-7w4xx\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.089178 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.089224 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c48e4f03d36241011b83eb7a2fc47aa7281b74a43bfbc9b981ccd0260b5e06a/globalmount\"" pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.112279 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-546fca10-9f37-4ba2-b571-6ee5f2eca8d7\") pod \"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.119937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4xx\" (UniqueName: \"kubernetes.io/projected/55c757d5-7d33-4ece-867a-899d5a4503a4-kube-api-access-7w4xx\") pod 
\"minio\" (UID: \"55c757d5-7d33-4ece-867a-899d5a4503a4\") " pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.177582 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.183028 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 19 19:08:46 crc kubenswrapper[5033]: W0319 19:08:46.205836 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1c346d_612f_4607_9952_56531d85d308.slice/crio-505819775b0bb6bc7c41c235fe1f21146984423a031dc28ea09b7e48d5bd111d WatchSource:0}: Error finding container 505819775b0bb6bc7c41c235fe1f21146984423a031dc28ea09b7e48d5bd111d: Status 404 returned error can't find the container with id 505819775b0bb6bc7c41c235fe1f21146984423a031dc28ea09b7e48d5bd111d Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.346515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerStarted","Data":"505819775b0bb6bc7c41c235fe1f21146984423a031dc28ea09b7e48d5bd111d"} Mar 19 19:08:46 crc kubenswrapper[5033]: I0319 19:08:46.637953 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 19:08:46 crc kubenswrapper[5033]: W0319 19:08:46.641928 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c757d5_7d33_4ece_867a_899d5a4503a4.slice/crio-f59c6d3fff74a23725a7c2596a27b004778a343074bdc8f7b6ee1351dc031e9c WatchSource:0}: Error finding container f59c6d3fff74a23725a7c2596a27b004778a343074bdc8f7b6ee1351dc031e9c: Status 404 returned error can't find the container with id f59c6d3fff74a23725a7c2596a27b004778a343074bdc8f7b6ee1351dc031e9c Mar 19 19:08:47 crc kubenswrapper[5033]: 
I0319 19:08:47.352820 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"55c757d5-7d33-4ece-867a-899d5a4503a4","Type":"ContainerStarted","Data":"f59c6d3fff74a23725a7c2596a27b004778a343074bdc8f7b6ee1351dc031e9c"} Mar 19 19:08:47 crc kubenswrapper[5033]: I0319 19:08:47.354713 5033 generic.go:334] "Generic (PLEG): container finished" podID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerID="c6068fd16e4f5698290ddde877be276a66be0d0621af17058b04c349bb33f57c" exitCode=0 Mar 19 19:08:47 crc kubenswrapper[5033]: I0319 19:08:47.354761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" event={"ID":"3470cf1f-22c6-4e0b-b298-7500f269fba3","Type":"ContainerDied","Data":"c6068fd16e4f5698290ddde877be276a66be0d0621af17058b04c349bb33f57c"} Mar 19 19:08:47 crc kubenswrapper[5033]: I0319 19:08:47.357984 5033 generic.go:334] "Generic (PLEG): container finished" podID="ba1c346d-612f-4607-9952-56531d85d308" containerID="c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec" exitCode=0 Mar 19 19:08:47 crc kubenswrapper[5033]: I0319 19:08:47.358024 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerDied","Data":"c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec"} Mar 19 19:08:48 crc kubenswrapper[5033]: I0319 19:08:48.371462 5033 generic.go:334] "Generic (PLEG): container finished" podID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerID="4aea9fd8bc3d29edeb3510612172fa4bead9932cd223450a9431db7a574a7d60" exitCode=0 Mar 19 19:08:48 crc kubenswrapper[5033]: I0319 19:08:48.371653 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" 
event={"ID":"3470cf1f-22c6-4e0b-b298-7500f269fba3","Type":"ContainerDied","Data":"4aea9fd8bc3d29edeb3510612172fa4bead9932cd223450a9431db7a574a7d60"} Mar 19 19:08:48 crc kubenswrapper[5033]: I0319 19:08:48.373383 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerStarted","Data":"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11"} Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.385228 5033 generic.go:334] "Generic (PLEG): container finished" podID="ba1c346d-612f-4607-9952-56531d85d308" containerID="c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11" exitCode=0 Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.385317 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerDied","Data":"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11"} Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.695946 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.838812 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle\") pod \"3470cf1f-22c6-4e0b-b298-7500f269fba3\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.838883 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94vt\" (UniqueName: \"kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt\") pod \"3470cf1f-22c6-4e0b-b298-7500f269fba3\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.838957 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util\") pod \"3470cf1f-22c6-4e0b-b298-7500f269fba3\" (UID: \"3470cf1f-22c6-4e0b-b298-7500f269fba3\") " Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.840071 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle" (OuterVolumeSpecName: "bundle") pod "3470cf1f-22c6-4e0b-b298-7500f269fba3" (UID: "3470cf1f-22c6-4e0b-b298-7500f269fba3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.844388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt" (OuterVolumeSpecName: "kube-api-access-k94vt") pod "3470cf1f-22c6-4e0b-b298-7500f269fba3" (UID: "3470cf1f-22c6-4e0b-b298-7500f269fba3"). InnerVolumeSpecName "kube-api-access-k94vt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.865623 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util" (OuterVolumeSpecName: "util") pod "3470cf1f-22c6-4e0b-b298-7500f269fba3" (UID: "3470cf1f-22c6-4e0b-b298-7500f269fba3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.940085 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.940117 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94vt\" (UniqueName: \"kubernetes.io/projected/3470cf1f-22c6-4e0b-b298-7500f269fba3-kube-api-access-k94vt\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:49 crc kubenswrapper[5033]: I0319 19:08:49.940128 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3470cf1f-22c6-4e0b-b298-7500f269fba3-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.393405 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" event={"ID":"3470cf1f-22c6-4e0b-b298-7500f269fba3","Type":"ContainerDied","Data":"c18fa2cb27bb797605c5ff078f9473e03fdc998c7771bb1d21ad5fae062cd5ba"} Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.393729 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c18fa2cb27bb797605c5ff078f9473e03fdc998c7771bb1d21ad5fae062cd5ba" Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.393502 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb" Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.396886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerStarted","Data":"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860"} Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.398665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"55c757d5-7d33-4ece-867a-899d5a4503a4","Type":"ContainerStarted","Data":"00175bc0f5fb13a8b8edf975c72a984e2875a5826980f2632533bfdd44e13915"} Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.425478 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ftwvs" podStartSLOduration=2.9253280630000003 podStartE2EDuration="5.425459851s" podCreationTimestamp="2026-03-19 19:08:45 +0000 UTC" firstStartedPulling="2026-03-19 19:08:47.358953004 +0000 UTC m=+737.463982853" lastFinishedPulling="2026-03-19 19:08:49.859084792 +0000 UTC m=+739.964114641" observedRunningTime="2026-03-19 19:08:50.422442066 +0000 UTC m=+740.527471935" watchObservedRunningTime="2026-03-19 19:08:50.425459851 +0000 UTC m=+740.530489700" Mar 19 19:08:50 crc kubenswrapper[5033]: I0319 19:08:50.436580 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.386897384 podStartE2EDuration="7.436556675s" podCreationTimestamp="2026-03-19 19:08:43 +0000 UTC" firstStartedPulling="2026-03-19 19:08:46.643901776 +0000 UTC m=+736.748931625" lastFinishedPulling="2026-03-19 19:08:49.693561067 +0000 UTC m=+739.798590916" observedRunningTime="2026-03-19 19:08:50.435371572 +0000 UTC m=+740.540401431" watchObservedRunningTime="2026-03-19 19:08:50.436556675 +0000 UTC m=+740.541586534" Mar 19 19:08:55 crc 
kubenswrapper[5033]: I0319 19:08:55.935874 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n"] Mar 19 19:08:55 crc kubenswrapper[5033]: E0319 19:08:55.936465 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="extract" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.936482 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="extract" Mar 19 19:08:55 crc kubenswrapper[5033]: E0319 19:08:55.936500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="util" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.936507 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="util" Mar 19 19:08:55 crc kubenswrapper[5033]: E0319 19:08:55.936525 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="pull" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.936531 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="pull" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.936654 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3470cf1f-22c6-4e0b-b298-7500f269fba3" containerName="extract" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.938047 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.962209 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.962240 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.962424 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.962839 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-mb9hh" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.963019 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.963047 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.967696 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n"] Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.976591 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:55 crc kubenswrapper[5033]: I0319 19:08:55.976829 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.018158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jgczw\" (UniqueName: \"kubernetes.io/projected/25d27288-ba82-4c74-a864-b5e54e4be246-kube-api-access-jgczw\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.018306 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.018340 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/25d27288-ba82-4c74-a864-b5e54e4be246-manager-config\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.018384 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-apiservice-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.018407 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-webhook-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.120138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgczw\" (UniqueName: \"kubernetes.io/projected/25d27288-ba82-4c74-a864-b5e54e4be246-kube-api-access-jgczw\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.120292 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.120414 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/25d27288-ba82-4c74-a864-b5e54e4be246-manager-config\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.120502 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-apiservice-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: 
\"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.120529 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-webhook-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.121361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/25d27288-ba82-4c74-a864-b5e54e4be246-manager-config\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.126008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-webhook-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.126021 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.127648 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25d27288-ba82-4c74-a864-b5e54e4be246-apiservice-cert\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.138650 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgczw\" (UniqueName: \"kubernetes.io/projected/25d27288-ba82-4c74-a864-b5e54e4be246-kube-api-access-jgczw\") pod \"loki-operator-controller-manager-767b88fbc9-6bv6n\" (UID: \"25d27288-ba82-4c74-a864-b5e54e4be246\") " pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.273676 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:08:56 crc kubenswrapper[5033]: I0319 19:08:56.711406 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n"] Mar 19 19:08:57 crc kubenswrapper[5033]: I0319 19:08:57.036234 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ftwvs" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="registry-server" probeResult="failure" output=< Mar 19 19:08:57 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:08:57 crc kubenswrapper[5033]: > Mar 19 19:08:57 crc kubenswrapper[5033]: I0319 19:08:57.446014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" event={"ID":"25d27288-ba82-4c74-a864-b5e54e4be246","Type":"ContainerStarted","Data":"6923a74a13fb6b396fa615c937849594910cb8a69f103f358ac451a69187a3e7"} Mar 19 
19:09:02 crc kubenswrapper[5033]: I0319 19:09:02.382646 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:09:02 crc kubenswrapper[5033]: I0319 19:09:02.475911 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" event={"ID":"25d27288-ba82-4c74-a864-b5e54e4be246","Type":"ContainerStarted","Data":"16e79bc9cbc28b9d860e2cb692b7d96a1605282b88fdaa1e1363ca3cd8a96280"} Mar 19 19:09:06 crc kubenswrapper[5033]: I0319 19:09:06.031486 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:09:06 crc kubenswrapper[5033]: I0319 19:09:06.088614 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:09:08 crc kubenswrapper[5033]: I0319 19:09:08.429740 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:09:08 crc kubenswrapper[5033]: I0319 19:09:08.429941 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ftwvs" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="registry-server" containerID="cri-o://ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860" gracePeriod=2 Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.235005 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.308298 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zst\" (UniqueName: \"kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst\") pod \"ba1c346d-612f-4607-9952-56531d85d308\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.309534 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities\") pod \"ba1c346d-612f-4607-9952-56531d85d308\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.309644 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content\") pod \"ba1c346d-612f-4607-9952-56531d85d308\" (UID: \"ba1c346d-612f-4607-9952-56531d85d308\") " Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.311375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities" (OuterVolumeSpecName: "utilities") pod "ba1c346d-612f-4607-9952-56531d85d308" (UID: "ba1c346d-612f-4607-9952-56531d85d308"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.316428 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst" (OuterVolumeSpecName: "kube-api-access-z9zst") pod "ba1c346d-612f-4607-9952-56531d85d308" (UID: "ba1c346d-612f-4607-9952-56531d85d308"). InnerVolumeSpecName "kube-api-access-z9zst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.412864 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.412903 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zst\" (UniqueName: \"kubernetes.io/projected/ba1c346d-612f-4607-9952-56531d85d308-kube-api-access-z9zst\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.464324 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba1c346d-612f-4607-9952-56531d85d308" (UID: "ba1c346d-612f-4607-9952-56531d85d308"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.513830 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba1c346d-612f-4607-9952-56531d85d308-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.527312 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" event={"ID":"25d27288-ba82-4c74-a864-b5e54e4be246","Type":"ContainerStarted","Data":"753fd7a837025a213fd68e8ae2200ec3efd98fcc6193b1f184faf0416a532d84"} Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.527994 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.530202 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="ba1c346d-612f-4607-9952-56531d85d308" containerID="ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860" exitCode=0 Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.530273 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftwvs" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.530286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerDied","Data":"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860"} Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.530344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftwvs" event={"ID":"ba1c346d-612f-4607-9952-56531d85d308","Type":"ContainerDied","Data":"505819775b0bb6bc7c41c235fe1f21146984423a031dc28ea09b7e48d5bd111d"} Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.530375 5033 scope.go:117] "RemoveContainer" containerID="ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.539913 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.564102 5033 scope.go:117] "RemoveContainer" containerID="c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.570922 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-767b88fbc9-6bv6n" podStartSLOduration=2.236414924 podStartE2EDuration="14.570898693s" podCreationTimestamp="2026-03-19 19:08:55 +0000 UTC" firstStartedPulling="2026-03-19 19:08:56.72007881 +0000 UTC m=+746.825108659" lastFinishedPulling="2026-03-19 
19:09:09.054562589 +0000 UTC m=+759.159592428" observedRunningTime="2026-03-19 19:09:09.563282767 +0000 UTC m=+759.668312636" watchObservedRunningTime="2026-03-19 19:09:09.570898693 +0000 UTC m=+759.675928542" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.593983 5033 scope.go:117] "RemoveContainer" containerID="c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.610943 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.616694 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ftwvs"] Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.616961 5033 scope.go:117] "RemoveContainer" containerID="ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860" Mar 19 19:09:09 crc kubenswrapper[5033]: E0319 19:09:09.617347 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860\": container with ID starting with ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860 not found: ID does not exist" containerID="ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.617373 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860"} err="failed to get container status \"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860\": rpc error: code = NotFound desc = could not find container \"ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860\": container with ID starting with ab7c2f12300d4b70440353c6476cc4824f3f306b12844bf1772fcda27b863860 not found: ID does not exist" Mar 19 19:09:09 crc 
kubenswrapper[5033]: I0319 19:09:09.617391 5033 scope.go:117] "RemoveContainer" containerID="c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11" Mar 19 19:09:09 crc kubenswrapper[5033]: E0319 19:09:09.617599 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11\": container with ID starting with c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11 not found: ID does not exist" containerID="c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.617618 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11"} err="failed to get container status \"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11\": rpc error: code = NotFound desc = could not find container \"c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11\": container with ID starting with c3252cd318f8762fae3ec7cdea2c838ffcc8c678199f1ec0b0b91c4035187a11 not found: ID does not exist" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.617635 5033 scope.go:117] "RemoveContainer" containerID="c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec" Mar 19 19:09:09 crc kubenswrapper[5033]: E0319 19:09:09.617798 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec\": container with ID starting with c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec not found: ID does not exist" containerID="c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec" Mar 19 19:09:09 crc kubenswrapper[5033]: I0319 19:09:09.617811 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec"} err="failed to get container status \"c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec\": rpc error: code = NotFound desc = could not find container \"c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec\": container with ID starting with c9fb35b5b45f18a8cee1d4e752610a7ce8486b5006bed33081eaa13f0fa60eec not found: ID does not exist" Mar 19 19:09:10 crc kubenswrapper[5033]: I0319 19:09:10.627627 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba1c346d-612f-4607-9952-56531d85d308" path="/var/lib/kubelet/pods/ba1c346d-612f-4607-9952-56531d85d308/volumes" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.725216 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn"] Mar 19 19:09:40 crc kubenswrapper[5033]: E0319 19:09:40.726027 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="extract-utilities" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.726041 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="extract-utilities" Mar 19 19:09:40 crc kubenswrapper[5033]: E0319 19:09:40.726064 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="extract-content" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.726071 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="extract-content" Mar 19 19:09:40 crc kubenswrapper[5033]: E0319 19:09:40.726081 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="registry-server" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.726090 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="registry-server" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.726238 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1c346d-612f-4607-9952-56531d85d308" containerName="registry-server" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.727196 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.731294 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.742910 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn"] Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.815171 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.815388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.815426 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxh9q\" (UniqueName: \"kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.916722 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.916766 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxh9q\" (UniqueName: \"kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.916832 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.917272 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.917297 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:40 crc kubenswrapper[5033]: I0319 19:09:40.941320 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxh9q\" (UniqueName: \"kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:41 crc kubenswrapper[5033]: I0319 19:09:41.050316 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:41 crc kubenswrapper[5033]: I0319 19:09:41.488295 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn"] Mar 19 19:09:41 crc kubenswrapper[5033]: I0319 19:09:41.731582 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerID="7bb80b511d876a502b92d71e3737cf3888bb87ba86d70d1a5b3488b1fed80075" exitCode=0 Mar 19 19:09:41 crc kubenswrapper[5033]: I0319 19:09:41.731641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" event={"ID":"8e1f452b-54aa-4321-99d5-ebdf6991379a","Type":"ContainerDied","Data":"7bb80b511d876a502b92d71e3737cf3888bb87ba86d70d1a5b3488b1fed80075"} Mar 19 19:09:41 crc kubenswrapper[5033]: I0319 19:09:41.731868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" event={"ID":"8e1f452b-54aa-4321-99d5-ebdf6991379a","Type":"ContainerStarted","Data":"c2f83d7a58bd0e6472193437a84f1815de42d454ee74d985aaa2c56d49c2af02"} Mar 19 19:09:43 crc kubenswrapper[5033]: I0319 19:09:43.746177 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerID="de2dbca7c9bc579080ee670f3f7ce29d89cbead2212a86792a2b51143d3e6b87" exitCode=0 Mar 19 19:09:43 crc kubenswrapper[5033]: I0319 19:09:43.746260 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" event={"ID":"8e1f452b-54aa-4321-99d5-ebdf6991379a","Type":"ContainerDied","Data":"de2dbca7c9bc579080ee670f3f7ce29d89cbead2212a86792a2b51143d3e6b87"} Mar 19 19:09:44 crc kubenswrapper[5033]: I0319 19:09:44.755654 5033 
generic.go:334] "Generic (PLEG): container finished" podID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerID="5729a3ff50feb95ecf555a83fb1ea92151df8c4fa90ae983cbec0fc36489d29e" exitCode=0 Mar 19 19:09:44 crc kubenswrapper[5033]: I0319 19:09:44.755710 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" event={"ID":"8e1f452b-54aa-4321-99d5-ebdf6991379a","Type":"ContainerDied","Data":"5729a3ff50feb95ecf555a83fb1ea92151df8c4fa90ae983cbec0fc36489d29e"} Mar 19 19:09:45 crc kubenswrapper[5033]: I0319 19:09:45.974114 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:45.983138 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util\") pod \"8e1f452b-54aa-4321-99d5-ebdf6991379a\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:45.983195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle\") pod \"8e1f452b-54aa-4321-99d5-ebdf6991379a\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:45.983245 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxh9q\" (UniqueName: \"kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q\") pod \"8e1f452b-54aa-4321-99d5-ebdf6991379a\" (UID: \"8e1f452b-54aa-4321-99d5-ebdf6991379a\") " Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:45.984369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle" (OuterVolumeSpecName: "bundle") pod "8e1f452b-54aa-4321-99d5-ebdf6991379a" (UID: "8e1f452b-54aa-4321-99d5-ebdf6991379a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:45.990444 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q" (OuterVolumeSpecName: "kube-api-access-bxh9q") pod "8e1f452b-54aa-4321-99d5-ebdf6991379a" (UID: "8e1f452b-54aa-4321-99d5-ebdf6991379a"). InnerVolumeSpecName "kube-api-access-bxh9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.002736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util" (OuterVolumeSpecName: "util") pod "8e1f452b-54aa-4321-99d5-ebdf6991379a" (UID: "8e1f452b-54aa-4321-99d5-ebdf6991379a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.085405 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxh9q\" (UniqueName: \"kubernetes.io/projected/8e1f452b-54aa-4321-99d5-ebdf6991379a-kube-api-access-bxh9q\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.085746 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.085761 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1f452b-54aa-4321-99d5-ebdf6991379a-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.772694 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" event={"ID":"8e1f452b-54aa-4321-99d5-ebdf6991379a","Type":"ContainerDied","Data":"c2f83d7a58bd0e6472193437a84f1815de42d454ee74d985aaa2c56d49c2af02"} Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.772731 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f83d7a58bd0e6472193437a84f1815de42d454ee74d985aaa2c56d49c2af02" Mar 19 19:09:46 crc kubenswrapper[5033]: I0319 19:09:46.772780 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.763217 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-d97f6"] Mar 19 19:09:52 crc kubenswrapper[5033]: E0319 19:09:52.765701 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="pull" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.765734 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="pull" Mar 19 19:09:52 crc kubenswrapper[5033]: E0319 19:09:52.765766 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="extract" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.765781 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="extract" Mar 19 19:09:52 crc kubenswrapper[5033]: E0319 19:09:52.765819 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="util" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.765832 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="util" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.767100 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1f452b-54aa-4321-99d5-ebdf6991379a" containerName="extract" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.769344 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.772752 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.773137 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.773409 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bpptc" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.812036 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-d97f6"] Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.878777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2s2f\" (UniqueName: \"kubernetes.io/projected/51881d08-f38e-4817-b7d6-942913bff182-kube-api-access-j2s2f\") pod \"nmstate-operator-796d4cfff4-d97f6\" (UID: \"51881d08-f38e-4817-b7d6-942913bff182\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" Mar 19 19:09:52 crc kubenswrapper[5033]: I0319 19:09:52.980807 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2s2f\" (UniqueName: \"kubernetes.io/projected/51881d08-f38e-4817-b7d6-942913bff182-kube-api-access-j2s2f\") pod \"nmstate-operator-796d4cfff4-d97f6\" (UID: \"51881d08-f38e-4817-b7d6-942913bff182\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" Mar 19 19:09:53 crc kubenswrapper[5033]: I0319 19:09:53.000361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2s2f\" (UniqueName: \"kubernetes.io/projected/51881d08-f38e-4817-b7d6-942913bff182-kube-api-access-j2s2f\") pod \"nmstate-operator-796d4cfff4-d97f6\" (UID: 
\"51881d08-f38e-4817-b7d6-942913bff182\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" Mar 19 19:09:53 crc kubenswrapper[5033]: I0319 19:09:53.123603 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" Mar 19 19:09:53 crc kubenswrapper[5033]: I0319 19:09:53.335916 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-d97f6"] Mar 19 19:09:53 crc kubenswrapper[5033]: I0319 19:09:53.830321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" event={"ID":"51881d08-f38e-4817-b7d6-942913bff182","Type":"ContainerStarted","Data":"bdf15eb7e93ed69d9fae54de631162b90f068da1c136d8cf6d55a2bc9d1cc6c2"} Mar 19 19:09:55 crc kubenswrapper[5033]: I0319 19:09:55.843778 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" event={"ID":"51881d08-f38e-4817-b7d6-942913bff182","Type":"ContainerStarted","Data":"1fee8fce42731952f55ae5c0561d1d9077d85f8abfb3b04babbbdc508fd361fd"} Mar 19 19:09:55 crc kubenswrapper[5033]: I0319 19:09:55.867873 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-d97f6" podStartSLOduration=1.5609143699999999 podStartE2EDuration="3.867854895s" podCreationTimestamp="2026-03-19 19:09:52 +0000 UTC" firstStartedPulling="2026-03-19 19:09:53.344698993 +0000 UTC m=+803.449728852" lastFinishedPulling="2026-03-19 19:09:55.651639528 +0000 UTC m=+805.756669377" observedRunningTime="2026-03-19 19:09:55.866327925 +0000 UTC m=+805.971357774" watchObservedRunningTime="2026-03-19 19:09:55.867854895 +0000 UTC m=+805.972884734" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.145108 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565790-tvvt6"] Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 
19:10:00.147783 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.150994 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.151355 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.154263 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.158380 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-tvvt6"] Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.268564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jh5l\" (UniqueName: \"kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l\") pod \"auto-csr-approver-29565790-tvvt6\" (UID: \"eee59143-3db6-4ba6-95d0-2017f1fa65e0\") " pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.370010 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jh5l\" (UniqueName: \"kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l\") pod \"auto-csr-approver-29565790-tvvt6\" (UID: \"eee59143-3db6-4ba6-95d0-2017f1fa65e0\") " pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.404179 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jh5l\" (UniqueName: \"kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l\") pod 
\"auto-csr-approver-29565790-tvvt6\" (UID: \"eee59143-3db6-4ba6-95d0-2017f1fa65e0\") " pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.475136 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.866303 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-tvvt6"] Mar 19 19:10:00 crc kubenswrapper[5033]: I0319 19:10:00.884108 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" event={"ID":"eee59143-3db6-4ba6-95d0-2017f1fa65e0","Type":"ContainerStarted","Data":"33cc45983542fb8b76e8e91a1427caaaa3bc190fadd3c0a6ae0b48ffd354d67a"} Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.732215 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.733357 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.735258 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-cfqx6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.741929 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.742843 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.744378 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.746202 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.751577 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.787532 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-2z9z2"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.788394 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.865777 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.866459 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.868522 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.879903 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.879955 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-knlnc" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.884707 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr"] Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.888926 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-ovs-socket\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889015 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62nkq\" (UniqueName: \"kubernetes.io/projected/6687aa22-fc81-4b3b-a810-29da61fff408-kube-api-access-62nkq\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889054 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lsbj\" (UniqueName: \"kubernetes.io/projected/cacfaf75-f7b6-4ca4-ad40-661224a27fad-kube-api-access-5lsbj\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889080 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cacfaf75-f7b6-4ca4-ad40-661224a27fad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/06756e34-c004-4dc7-85ca-75b9c0b8ea15-kube-api-access-v2l2c\") pod \"nmstate-metrics-9b8c8685d-vpctf\" (UID: \"06756e34-c004-4dc7-85ca-75b9c0b8ea15\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889145 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-dbus-socket\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.889182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-nmstate-lock\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990060 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-ovs-socket\") pod \"nmstate-handler-2z9z2\" (UID: 
\"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b45e3fdc-a192-4b4c-830c-bfb94179eed7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990163 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62nkq\" (UniqueName: \"kubernetes.io/projected/6687aa22-fc81-4b3b-a810-29da61fff408-kube-api-access-62nkq\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/b45e3fdc-a192-4b4c-830c-bfb94179eed7-kube-api-access-vpwr7\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990206 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lsbj\" (UniqueName: \"kubernetes.io/projected/cacfaf75-f7b6-4ca4-ad40-661224a27fad-kube-api-access-5lsbj\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990204 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-ovs-socket\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990221 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cacfaf75-f7b6-4ca4-ad40-661224a27fad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990294 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/06756e34-c004-4dc7-85ca-75b9c0b8ea15-kube-api-access-v2l2c\") pod \"nmstate-metrics-9b8c8685d-vpctf\" (UID: \"06756e34-c004-4dc7-85ca-75b9c0b8ea15\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-dbus-socket\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990524 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-nmstate-lock\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/b45e3fdc-a192-4b4c-830c-bfb94179eed7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990613 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-nmstate-lock\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.990701 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6687aa22-fc81-4b3b-a810-29da61fff408-dbus-socket\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:01 crc kubenswrapper[5033]: I0319 19:10:01.998723 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cacfaf75-f7b6-4ca4-ad40-661224a27fad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.009313 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lsbj\" (UniqueName: \"kubernetes.io/projected/cacfaf75-f7b6-4ca4-ad40-661224a27fad-kube-api-access-5lsbj\") pod \"nmstate-webhook-5f558f5558-7fsr6\" (UID: \"cacfaf75-f7b6-4ca4-ad40-661224a27fad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.011591 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2l2c\" (UniqueName: 
\"kubernetes.io/projected/06756e34-c004-4dc7-85ca-75b9c0b8ea15-kube-api-access-v2l2c\") pod \"nmstate-metrics-9b8c8685d-vpctf\" (UID: \"06756e34-c004-4dc7-85ca-75b9c0b8ea15\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.011791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nkq\" (UniqueName: \"kubernetes.io/projected/6687aa22-fc81-4b3b-a810-29da61fff408-kube-api-access-62nkq\") pod \"nmstate-handler-2z9z2\" (UID: \"6687aa22-fc81-4b3b-a810-29da61fff408\") " pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.050087 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cbcc6cd47-9j59k"] Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.050939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.062144 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cbcc6cd47-9j59k"] Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.069231 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.078188 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.091277 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b45e3fdc-a192-4b4c-830c-bfb94179eed7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.091341 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b45e3fdc-a192-4b4c-830c-bfb94179eed7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.091364 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/b45e3fdc-a192-4b4c-830c-bfb94179eed7-kube-api-access-vpwr7\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.092216 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b45e3fdc-a192-4b4c-830c-bfb94179eed7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.096211 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b45e3fdc-a192-4b4c-830c-bfb94179eed7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.111484 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.116166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpwr7\" (UniqueName: \"kubernetes.io/projected/b45e3fdc-a192-4b4c-830c-bfb94179eed7-kube-api-access-vpwr7\") pod \"nmstate-console-plugin-86f58fcf4-vvqkr\" (UID: \"b45e3fdc-a192-4b4c-830c-bfb94179eed7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: W0319 19:10:02.128425 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6687aa22_fc81_4b3b_a810_29da61fff408.slice/crio-addb844ebb1b5be5525813c452de2ecdcce313630057c71fddb8d2b0984a6ccc WatchSource:0}: Error finding container addb844ebb1b5be5525813c452de2ecdcce313630057c71fddb8d2b0984a6ccc: Status 404 returned error can't find the container with id addb844ebb1b5be5525813c452de2ecdcce313630057c71fddb8d2b0984a6ccc Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.178348 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc66p\" (UniqueName: \"kubernetes.io/projected/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-kube-api-access-nc66p\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-oauth-serving-cert\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193666 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-serving-cert\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-oauth-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193709 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-service-ca\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-trusted-ca-bundle\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.193774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-oauth-serving-cert\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295279 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc66p\" (UniqueName: \"kubernetes.io/projected/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-kube-api-access-nc66p\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-serving-cert\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295352 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-oauth-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-service-ca\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295414 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-trusted-ca-bundle\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.295436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.296140 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.297368 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-service-ca\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.298198 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-trusted-ca-bundle\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.302293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-oauth-serving-cert\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.304400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-oauth-config\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.305706 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-console-serving-cert\") 
pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.314302 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc66p\" (UniqueName: \"kubernetes.io/projected/8eda24e7-ffc1-470a-9bf5-1bc9730a52b3-kube-api-access-nc66p\") pod \"console-cbcc6cd47-9j59k\" (UID: \"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3\") " pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.364529 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.528870 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf"] Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.547793 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6"] Mar 19 19:10:02 crc kubenswrapper[5033]: W0319 19:10:02.578785 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcacfaf75_f7b6_4ca4_ad40_661224a27fad.slice/crio-75d9921b8c826ddd2a62891288fe13b03f802f33490d558b975714bd483947da WatchSource:0}: Error finding container 75d9921b8c826ddd2a62891288fe13b03f802f33490d558b975714bd483947da: Status 404 returned error can't find the container with id 75d9921b8c826ddd2a62891288fe13b03f802f33490d558b975714bd483947da Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.629067 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr"] Mar 19 19:10:02 crc kubenswrapper[5033]: W0319 19:10:02.637359 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45e3fdc_a192_4b4c_830c_bfb94179eed7.slice/crio-86b4ba251bba471fc42d4f0198898651a9956cfe27de3f15c5a5e009ec3b75ae WatchSource:0}: Error finding container 86b4ba251bba471fc42d4f0198898651a9956cfe27de3f15c5a5e009ec3b75ae: Status 404 returned error can't find the container with id 86b4ba251bba471fc42d4f0198898651a9956cfe27de3f15c5a5e009ec3b75ae Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.669214 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cbcc6cd47-9j59k"] Mar 19 19:10:02 crc kubenswrapper[5033]: W0319 19:10:02.671180 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eda24e7_ffc1_470a_9bf5_1bc9730a52b3.slice/crio-9815aa233bf95ed9a61acddd3220e026fed1083dd40151b61d786cc58bb9fd73 WatchSource:0}: Error finding container 9815aa233bf95ed9a61acddd3220e026fed1083dd40151b61d786cc58bb9fd73: Status 404 returned error can't find the container with id 9815aa233bf95ed9a61acddd3220e026fed1083dd40151b61d786cc58bb9fd73 Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.896000 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" event={"ID":"06756e34-c004-4dc7-85ca-75b9c0b8ea15","Type":"ContainerStarted","Data":"cf97f374428d39effd11c886ebf4cb61b9e18d4239f3b598efa7efa2e5869bef"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.897182 5033 generic.go:334] "Generic (PLEG): container finished" podID="eee59143-3db6-4ba6-95d0-2017f1fa65e0" containerID="3164491618e21e4463d0c4200061cbb3c985e29e5355e7be7299b32234df048f" exitCode=0 Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.897224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" 
event={"ID":"eee59143-3db6-4ba6-95d0-2017f1fa65e0","Type":"ContainerDied","Data":"3164491618e21e4463d0c4200061cbb3c985e29e5355e7be7299b32234df048f"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.897990 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" event={"ID":"b45e3fdc-a192-4b4c-830c-bfb94179eed7","Type":"ContainerStarted","Data":"86b4ba251bba471fc42d4f0198898651a9956cfe27de3f15c5a5e009ec3b75ae"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.898665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" event={"ID":"cacfaf75-f7b6-4ca4-ad40-661224a27fad","Type":"ContainerStarted","Data":"75d9921b8c826ddd2a62891288fe13b03f802f33490d558b975714bd483947da"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.899557 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2z9z2" event={"ID":"6687aa22-fc81-4b3b-a810-29da61fff408","Type":"ContainerStarted","Data":"addb844ebb1b5be5525813c452de2ecdcce313630057c71fddb8d2b0984a6ccc"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.900904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbcc6cd47-9j59k" event={"ID":"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3","Type":"ContainerStarted","Data":"b6bb4bd92d6ed234ac26d78d9756fa11ae2776a924ba5fd9e3ca68047117ab07"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.900932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbcc6cd47-9j59k" event={"ID":"8eda24e7-ffc1-470a-9bf5-1bc9730a52b3","Type":"ContainerStarted","Data":"9815aa233bf95ed9a61acddd3220e026fed1083dd40151b61d786cc58bb9fd73"} Mar 19 19:10:02 crc kubenswrapper[5033]: I0319 19:10:02.925221 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cbcc6cd47-9j59k" podStartSLOduration=0.925203013 
podStartE2EDuration="925.203013ms" podCreationTimestamp="2026-03-19 19:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:10:02.921937229 +0000 UTC m=+813.026967078" watchObservedRunningTime="2026-03-19 19:10:02.925203013 +0000 UTC m=+813.030232862" Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.193439 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.327260 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jh5l\" (UniqueName: \"kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l\") pod \"eee59143-3db6-4ba6-95d0-2017f1fa65e0\" (UID: \"eee59143-3db6-4ba6-95d0-2017f1fa65e0\") " Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.333045 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l" (OuterVolumeSpecName: "kube-api-access-7jh5l") pod "eee59143-3db6-4ba6-95d0-2017f1fa65e0" (UID: "eee59143-3db6-4ba6-95d0-2017f1fa65e0"). InnerVolumeSpecName "kube-api-access-7jh5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.428418 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jh5l\" (UniqueName: \"kubernetes.io/projected/eee59143-3db6-4ba6-95d0-2017f1fa65e0-kube-api-access-7jh5l\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.924362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" event={"ID":"eee59143-3db6-4ba6-95d0-2017f1fa65e0","Type":"ContainerDied","Data":"33cc45983542fb8b76e8e91a1427caaaa3bc190fadd3c0a6ae0b48ffd354d67a"} Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.924827 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33cc45983542fb8b76e8e91a1427caaaa3bc190fadd3c0a6ae0b48ffd354d67a" Mar 19 19:10:04 crc kubenswrapper[5033]: I0319 19:10:04.924425 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-tvvt6" Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.269319 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-n2knf"] Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.274909 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-n2knf"] Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.933044 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2z9z2" event={"ID":"6687aa22-fc81-4b3b-a810-29da61fff408","Type":"ContainerStarted","Data":"c341a33fac81181ea77d012f4d8946bef9f72670339ba2478ba94f75bf4289cb"} Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.933384 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.934867 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" event={"ID":"06756e34-c004-4dc7-85ca-75b9c0b8ea15","Type":"ContainerStarted","Data":"39efce905a48af32a1d878062bc31154bd3ab77cc2834b2c1c04efc4b0a2a99b"} Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.936289 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" event={"ID":"cacfaf75-f7b6-4ca4-ad40-661224a27fad","Type":"ContainerStarted","Data":"30b3a9778932052694feec3154648e02602b3e299b780e9631b7468d95efd502"} Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.936411 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.949972 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-2z9z2" podStartSLOduration=1.5078351479999998 podStartE2EDuration="4.949954974s" podCreationTimestamp="2026-03-19 19:10:01 +0000 UTC" firstStartedPulling="2026-03-19 19:10:02.13071991 +0000 UTC m=+812.235749759" lastFinishedPulling="2026-03-19 19:10:05.572839736 +0000 UTC m=+815.677869585" observedRunningTime="2026-03-19 19:10:05.947644304 +0000 UTC m=+816.052674183" watchObservedRunningTime="2026-03-19 19:10:05.949954974 +0000 UTC m=+816.054984813" Mar 19 19:10:05 crc kubenswrapper[5033]: I0319 19:10:05.963382 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" podStartSLOduration=2.036832042 podStartE2EDuration="4.963361609s" podCreationTimestamp="2026-03-19 19:10:01 +0000 UTC" firstStartedPulling="2026-03-19 19:10:02.586099866 +0000 UTC m=+812.691129705" lastFinishedPulling="2026-03-19 19:10:05.512629423 +0000 UTC m=+815.617659272" observedRunningTime="2026-03-19 19:10:05.963027891 +0000 UTC m=+816.068057740" watchObservedRunningTime="2026-03-19 
19:10:05.963361609 +0000 UTC m=+816.068391458" Mar 19 19:10:06 crc kubenswrapper[5033]: I0319 19:10:06.628696 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989558ba-c11c-4f21-8354-aa0d39e841f1" path="/var/lib/kubelet/pods/989558ba-c11c-4f21-8354-aa0d39e841f1/volumes" Mar 19 19:10:06 crc kubenswrapper[5033]: I0319 19:10:06.944336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" event={"ID":"b45e3fdc-a192-4b4c-830c-bfb94179eed7","Type":"ContainerStarted","Data":"5076a4c92b28eafd7f6ae7096fd7c70913c6acd4953c8722073c18105e9a17d7"} Mar 19 19:10:06 crc kubenswrapper[5033]: I0319 19:10:06.962693 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vvqkr" podStartSLOduration=1.871414025 podStartE2EDuration="5.962676766s" podCreationTimestamp="2026-03-19 19:10:01 +0000 UTC" firstStartedPulling="2026-03-19 19:10:02.640624322 +0000 UTC m=+812.745654161" lastFinishedPulling="2026-03-19 19:10:06.731887053 +0000 UTC m=+816.836916902" observedRunningTime="2026-03-19 19:10:06.962367878 +0000 UTC m=+817.067397727" watchObservedRunningTime="2026-03-19 19:10:06.962676766 +0000 UTC m=+817.067706615" Mar 19 19:10:08 crc kubenswrapper[5033]: I0319 19:10:08.960120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" event={"ID":"06756e34-c004-4dc7-85ca-75b9c0b8ea15","Type":"ContainerStarted","Data":"beaa316c63f7d04009f6686181acbaffa6d0bf11bdd56ba3a29e9ce5c914788c"} Mar 19 19:10:08 crc kubenswrapper[5033]: I0319 19:10:08.977393 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vpctf" podStartSLOduration=2.001094211 podStartE2EDuration="7.977368853s" podCreationTimestamp="2026-03-19 19:10:01 +0000 UTC" firstStartedPulling="2026-03-19 19:10:02.546228198 +0000 UTC m=+812.651258047" 
lastFinishedPulling="2026-03-19 19:10:08.52250283 +0000 UTC m=+818.627532689" observedRunningTime="2026-03-19 19:10:08.974981781 +0000 UTC m=+819.080011650" watchObservedRunningTime="2026-03-19 19:10:08.977368853 +0000 UTC m=+819.082398742" Mar 19 19:10:12 crc kubenswrapper[5033]: I0319 19:10:12.150427 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-2z9z2" Mar 19 19:10:12 crc kubenswrapper[5033]: I0319 19:10:12.365726 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:12 crc kubenswrapper[5033]: I0319 19:10:12.365827 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:12 crc kubenswrapper[5033]: I0319 19:10:12.371840 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:12 crc kubenswrapper[5033]: I0319 19:10:12.991000 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cbcc6cd47-9j59k" Mar 19 19:10:13 crc kubenswrapper[5033]: I0319 19:10:13.051792 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 19:10:22 crc kubenswrapper[5033]: I0319 19:10:22.088207 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7fsr6" Mar 19 19:10:31 crc kubenswrapper[5033]: I0319 19:10:31.209058 5033 scope.go:117] "RemoveContainer" containerID="1465a9be8a5f6497878a78076b336a2b606a5a9d7ff442938e8f60aee0bd92a4" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.066370 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf"] Mar 19 19:10:35 crc kubenswrapper[5033]: E0319 19:10:35.068752 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="eee59143-3db6-4ba6-95d0-2017f1fa65e0" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.068872 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee59143-3db6-4ba6-95d0-2017f1fa65e0" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.069116 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee59143-3db6-4ba6-95d0-2017f1fa65e0" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.070230 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.074895 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.079400 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf"] Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.261399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.261469 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.261504 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nddnp\" (UniqueName: \"kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.363247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nddnp\" (UniqueName: \"kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.363388 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.363422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 
19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.363978 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.364652 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.393554 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nddnp\" (UniqueName: \"kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.447583 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:35 crc kubenswrapper[5033]: I0319 19:10:35.872251 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf"] Mar 19 19:10:36 crc kubenswrapper[5033]: I0319 19:10:36.343103 5033 generic.go:334] "Generic (PLEG): container finished" podID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerID="73c6fd09f69131e5ebf82a016dd04a40d96dd3acda14af0adf4bc04008145f7f" exitCode=0 Mar 19 19:10:36 crc kubenswrapper[5033]: I0319 19:10:36.343336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" event={"ID":"d25c425c-248a-4677-be0a-91a0cd1ccf41","Type":"ContainerDied","Data":"73c6fd09f69131e5ebf82a016dd04a40d96dd3acda14af0adf4bc04008145f7f"} Mar 19 19:10:36 crc kubenswrapper[5033]: I0319 19:10:36.343361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" event={"ID":"d25c425c-248a-4677-be0a-91a0cd1ccf41","Type":"ContainerStarted","Data":"c7f94c86130746ad5efaa66436ad94c552aaa836f2b83c8c7d65ca05af19be0e"} Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.179354 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5jw55" podUID="9837c3e1-e614-408e-8914-c1390367407f" containerName="console" containerID="cri-o://2d5ad4bf99ff8b7260e0c685b4a4dfc5f50064566ad5715b7ca4dc626b378755" gracePeriod=15 Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.358648 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5jw55_9837c3e1-e614-408e-8914-c1390367407f/console/0.log" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.358917 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="9837c3e1-e614-408e-8914-c1390367407f" containerID="2d5ad4bf99ff8b7260e0c685b4a4dfc5f50064566ad5715b7ca4dc626b378755" exitCode=2 Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.358950 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jw55" event={"ID":"9837c3e1-e614-408e-8914-c1390367407f","Type":"ContainerDied","Data":"2d5ad4bf99ff8b7260e0c685b4a4dfc5f50064566ad5715b7ca4dc626b378755"} Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.563317 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5jw55_9837c3e1-e614-408e-8914-c1390367407f/console/0.log" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.563380 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.705359 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqf8d\" (UniqueName: \"kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.705474 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.705526 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: 
I0319 19:10:38.705562 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.705596 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.705625 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.706171 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.706245 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config" (OuterVolumeSpecName: "console-config") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.706271 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.706792 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.706442 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca\") pod \"9837c3e1-e614-408e-8914-c1390367407f\" (UID: \"9837c3e1-e614-408e-8914-c1390367407f\") " Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.717601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d" (OuterVolumeSpecName: "kube-api-access-vqf8d") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "kube-api-access-vqf8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718259 5033 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718297 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718308 5033 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718321 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9837c3e1-e614-408e-8914-c1390367407f-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718337 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqf8d\" (UniqueName: \"kubernetes.io/projected/9837c3e1-e614-408e-8914-c1390367407f-kube-api-access-vqf8d\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718549 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.718619 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9837c3e1-e614-408e-8914-c1390367407f" (UID: "9837c3e1-e614-408e-8914-c1390367407f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.819565 5033 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:38 crc kubenswrapper[5033]: I0319 19:10:38.819608 5033 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9837c3e1-e614-408e-8914-c1390367407f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.367039 5033 generic.go:334] "Generic (PLEG): container finished" podID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerID="0c70aa7760d44927e59b203d4c899341ca7d1f73a85a80ef9ffab124fe1cde44" exitCode=0 Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.367149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" event={"ID":"d25c425c-248a-4677-be0a-91a0cd1ccf41","Type":"ContainerDied","Data":"0c70aa7760d44927e59b203d4c899341ca7d1f73a85a80ef9ffab124fe1cde44"} Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.370505 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5jw55_9837c3e1-e614-408e-8914-c1390367407f/console/0.log" Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.370581 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-f9d7485db-5jw55" event={"ID":"9837c3e1-e614-408e-8914-c1390367407f","Type":"ContainerDied","Data":"9d0f68dbfa554d3e422a582c0a3e3cf34c3299cd251d5898729ea11a79024ee5"} Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.370632 5033 scope.go:117] "RemoveContainer" containerID="2d5ad4bf99ff8b7260e0c685b4a4dfc5f50064566ad5715b7ca4dc626b378755" Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.370719 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jw55" Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.416583 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 19:10:39 crc kubenswrapper[5033]: I0319 19:10:39.420144 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5jw55"] Mar 19 19:10:40 crc kubenswrapper[5033]: I0319 19:10:40.380089 5033 generic.go:334] "Generic (PLEG): container finished" podID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerID="5072a5b3d875d761e0919650a3969e2fc0223804107b78031b15415c16f38cf7" exitCode=0 Mar 19 19:10:40 crc kubenswrapper[5033]: I0319 19:10:40.380131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" event={"ID":"d25c425c-248a-4677-be0a-91a0cd1ccf41","Type":"ContainerDied","Data":"5072a5b3d875d761e0919650a3969e2fc0223804107b78031b15415c16f38cf7"} Mar 19 19:10:40 crc kubenswrapper[5033]: I0319 19:10:40.629621 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9837c3e1-e614-408e-8914-c1390367407f" path="/var/lib/kubelet/pods/9837c3e1-e614-408e-8914-c1390367407f/volumes" Mar 19 19:10:40 crc kubenswrapper[5033]: I0319 19:10:40.758912 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:10:40 crc kubenswrapper[5033]: I0319 19:10:40.758968 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.642445 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.800141 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util\") pod \"d25c425c-248a-4677-be0a-91a0cd1ccf41\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.800212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nddnp\" (UniqueName: \"kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp\") pod \"d25c425c-248a-4677-be0a-91a0cd1ccf41\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.800246 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle\") pod \"d25c425c-248a-4677-be0a-91a0cd1ccf41\" (UID: \"d25c425c-248a-4677-be0a-91a0cd1ccf41\") " Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.801839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle" (OuterVolumeSpecName: "bundle") pod "d25c425c-248a-4677-be0a-91a0cd1ccf41" (UID: "d25c425c-248a-4677-be0a-91a0cd1ccf41"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.806626 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp" (OuterVolumeSpecName: "kube-api-access-nddnp") pod "d25c425c-248a-4677-be0a-91a0cd1ccf41" (UID: "d25c425c-248a-4677-be0a-91a0cd1ccf41"). InnerVolumeSpecName "kube-api-access-nddnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.809507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util" (OuterVolumeSpecName: "util") pod "d25c425c-248a-4677-be0a-91a0cd1ccf41" (UID: "d25c425c-248a-4677-be0a-91a0cd1ccf41"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.901921 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.901954 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nddnp\" (UniqueName: \"kubernetes.io/projected/d25c425c-248a-4677-be0a-91a0cd1ccf41-kube-api-access-nddnp\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:41 crc kubenswrapper[5033]: I0319 19:10:41.901964 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25c425c-248a-4677-be0a-91a0cd1ccf41-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:42 crc kubenswrapper[5033]: I0319 19:10:42.395668 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" event={"ID":"d25c425c-248a-4677-be0a-91a0cd1ccf41","Type":"ContainerDied","Data":"c7f94c86130746ad5efaa66436ad94c552aaa836f2b83c8c7d65ca05af19be0e"} Mar 19 19:10:42 crc kubenswrapper[5033]: I0319 19:10:42.395707 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f94c86130746ad5efaa66436ad94c552aaa836f2b83c8c7d65ca05af19be0e" Mar 19 19:10:42 crc kubenswrapper[5033]: I0319 19:10:42.395944 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.627652 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz"] Mar 19 19:10:50 crc kubenswrapper[5033]: E0319 19:10:50.628601 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="extract" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628617 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="extract" Mar 19 19:10:50 crc kubenswrapper[5033]: E0319 19:10:50.628637 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837c3e1-e614-408e-8914-c1390367407f" containerName="console" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628645 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837c3e1-e614-408e-8914-c1390367407f" containerName="console" Mar 19 19:10:50 crc kubenswrapper[5033]: E0319 19:10:50.628665 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="pull" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628673 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="pull" Mar 19 19:10:50 crc kubenswrapper[5033]: E0319 19:10:50.628689 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="util" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628696 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" containerName="util" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628816 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25c425c-248a-4677-be0a-91a0cd1ccf41" 
containerName="extract" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.628839 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9837c3e1-e614-408e-8914-c1390367407f" containerName="console" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.629327 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.633228 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.633429 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.633570 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2zz99" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.633702 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.633880 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.649207 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz"] Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.806095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-apiservice-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 
19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.806524 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-webhook-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.806616 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmzzt\" (UniqueName: \"kubernetes.io/projected/7c3f2160-7476-4b83-aa92-cb6064f90495-kube-api-access-wmzzt\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.907426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmzzt\" (UniqueName: \"kubernetes.io/projected/7c3f2160-7476-4b83-aa92-cb6064f90495-kube-api-access-wmzzt\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.907530 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-apiservice-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.907578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-webhook-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.914181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-webhook-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.926528 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c3f2160-7476-4b83-aa92-cb6064f90495-apiservice-cert\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.927034 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmzzt\" (UniqueName: \"kubernetes.io/projected/7c3f2160-7476-4b83-aa92-cb6064f90495-kube-api-access-wmzzt\") pod \"metallb-operator-controller-manager-5db5b6db5f-r8dbz\" (UID: \"7c3f2160-7476-4b83-aa92-cb6064f90495\") " pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:50 crc kubenswrapper[5033]: I0319 19:10:50.983398 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.046665 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp"] Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.047330 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.050921 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.051072 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dw7q4" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.051132 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.062431 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp"] Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.210671 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-webhook-cert\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.210789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.210817 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzng\" (UniqueName: \"kubernetes.io/projected/edc8556f-ba2e-4280-bab3-5778367b0c75-kube-api-access-7tzng\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.311736 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-apiservice-cert\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.311780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzng\" (UniqueName: \"kubernetes.io/projected/edc8556f-ba2e-4280-bab3-5778367b0c75-kube-api-access-7tzng\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.311843 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-webhook-cert\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 
19:10:51.315916 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-apiservice-cert\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.315991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/edc8556f-ba2e-4280-bab3-5778367b0c75-webhook-cert\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.326584 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzng\" (UniqueName: \"kubernetes.io/projected/edc8556f-ba2e-4280-bab3-5778367b0c75-kube-api-access-7tzng\") pod \"metallb-operator-webhook-server-7fdfd57896-smckp\" (UID: \"edc8556f-ba2e-4280-bab3-5778367b0c75\") " pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.367706 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.483661 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz"] Mar 19 19:10:51 crc kubenswrapper[5033]: W0319 19:10:51.508966 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3f2160_7476_4b83_aa92_cb6064f90495.slice/crio-1711a20595ab228728cd91cb2b45af5b587fce2dda1a31f3686f6108cb1363eb WatchSource:0}: Error finding container 1711a20595ab228728cd91cb2b45af5b587fce2dda1a31f3686f6108cb1363eb: Status 404 returned error can't find the container with id 1711a20595ab228728cd91cb2b45af5b587fce2dda1a31f3686f6108cb1363eb Mar 19 19:10:51 crc kubenswrapper[5033]: I0319 19:10:51.837172 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp"] Mar 19 19:10:51 crc kubenswrapper[5033]: W0319 19:10:51.850241 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedc8556f_ba2e_4280_bab3_5778367b0c75.slice/crio-8028f8d7276e39ed95107b41b3478d8d7673bad46259ebe71d2fa24f2cf2e22f WatchSource:0}: Error finding container 8028f8d7276e39ed95107b41b3478d8d7673bad46259ebe71d2fa24f2cf2e22f: Status 404 returned error can't find the container with id 8028f8d7276e39ed95107b41b3478d8d7673bad46259ebe71d2fa24f2cf2e22f Mar 19 19:10:52 crc kubenswrapper[5033]: I0319 19:10:52.450416 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" event={"ID":"edc8556f-ba2e-4280-bab3-5778367b0c75","Type":"ContainerStarted","Data":"8028f8d7276e39ed95107b41b3478d8d7673bad46259ebe71d2fa24f2cf2e22f"} Mar 19 19:10:52 crc kubenswrapper[5033]: I0319 19:10:52.451764 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" event={"ID":"7c3f2160-7476-4b83-aa92-cb6064f90495","Type":"ContainerStarted","Data":"1711a20595ab228728cd91cb2b45af5b587fce2dda1a31f3686f6108cb1363eb"} Mar 19 19:10:55 crc kubenswrapper[5033]: I0319 19:10:55.472823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" event={"ID":"7c3f2160-7476-4b83-aa92-cb6064f90495","Type":"ContainerStarted","Data":"625b4930958e1ee2f6cac6d48d17c0919ce37964594d569f3fa45acdf04f1360"} Mar 19 19:10:55 crc kubenswrapper[5033]: I0319 19:10:55.474383 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:10:55 crc kubenswrapper[5033]: I0319 19:10:55.491247 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" podStartSLOduration=2.2948136789999998 podStartE2EDuration="5.491229154s" podCreationTimestamp="2026-03-19 19:10:50 +0000 UTC" firstStartedPulling="2026-03-19 19:10:51.517267107 +0000 UTC m=+861.622296956" lastFinishedPulling="2026-03-19 19:10:54.713682572 +0000 UTC m=+864.818712431" observedRunningTime="2026-03-19 19:10:55.486740508 +0000 UTC m=+865.591770357" watchObservedRunningTime="2026-03-19 19:10:55.491229154 +0000 UTC m=+865.596259003" Mar 19 19:10:57 crc kubenswrapper[5033]: I0319 19:10:57.484310 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" event={"ID":"edc8556f-ba2e-4280-bab3-5778367b0c75","Type":"ContainerStarted","Data":"b0bc323c0cc38e19f70603fe3e32e87cebf6115e264af4c2e9de7da923495b9d"} Mar 19 19:10:57 crc kubenswrapper[5033]: I0319 19:10:57.484675 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 
19:10:57 crc kubenswrapper[5033]: I0319 19:10:57.499119 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" podStartSLOduration=1.7494445 podStartE2EDuration="6.499102647s" podCreationTimestamp="2026-03-19 19:10:51 +0000 UTC" firstStartedPulling="2026-03-19 19:10:51.854601955 +0000 UTC m=+861.959631804" lastFinishedPulling="2026-03-19 19:10:56.604260102 +0000 UTC m=+866.709289951" observedRunningTime="2026-03-19 19:10:57.497353447 +0000 UTC m=+867.602383306" watchObservedRunningTime="2026-03-19 19:10:57.499102647 +0000 UTC m=+867.604132496" Mar 19 19:11:10 crc kubenswrapper[5033]: I0319 19:11:10.759216 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:11:10 crc kubenswrapper[5033]: I0319 19:11:10.759831 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:11:11 crc kubenswrapper[5033]: I0319 19:11:11.373171 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7fdfd57896-smckp" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.691722 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.693649 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.703790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.888734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.888826 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.888848 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pthqb\" (UniqueName: \"kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.990090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.990167 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.990184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pthqb\" (UniqueName: \"kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.990573 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:26 crc kubenswrapper[5033]: I0319 19:11:26.990655 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:27 crc kubenswrapper[5033]: I0319 19:11:27.009285 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pthqb\" (UniqueName: \"kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb\") pod \"community-operators-g8k6f\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:27 crc kubenswrapper[5033]: I0319 19:11:27.059358 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:27 crc kubenswrapper[5033]: I0319 19:11:27.506352 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:27 crc kubenswrapper[5033]: I0319 19:11:27.672603 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerStarted","Data":"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8"} Mar 19 19:11:27 crc kubenswrapper[5033]: I0319 19:11:27.672648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerStarted","Data":"757b2f3dabd4eb6ab5161d21a306b442e05660e812e3033589de38cb4e22bc17"} Mar 19 19:11:28 crc kubenswrapper[5033]: I0319 19:11:28.682705 5033 generic.go:334] "Generic (PLEG): container finished" podID="9908e686-0752-455b-a658-027cc16ca41c" containerID="c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8" exitCode=0 Mar 19 19:11:28 crc kubenswrapper[5033]: I0319 19:11:28.682774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerDied","Data":"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8"} Mar 19 19:11:30 crc kubenswrapper[5033]: I0319 19:11:30.897637 5033 generic.go:334] "Generic (PLEG): container finished" podID="9908e686-0752-455b-a658-027cc16ca41c" containerID="d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c" exitCode=0 Mar 19 19:11:30 crc kubenswrapper[5033]: I0319 19:11:30.897757 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" 
event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerDied","Data":"d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c"} Mar 19 19:11:30 crc kubenswrapper[5033]: I0319 19:11:30.985847 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5db5b6db5f-r8dbz" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.719135 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pdvpg"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.722224 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.724138 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.725719 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.726796 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.727559 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4br84" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.730122 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.735044 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.738113 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.799043 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bn5m8"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.799948 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bn5m8" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.802389 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.802422 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.802392 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dzgb4" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.804368 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.825858 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8f17329-5c04-45bf-9246-fa2aec1e793f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.825914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e02d5e27-7333-463b-9144-9fa72da9592c-metrics-certs\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.825950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnxt\" (UniqueName: \"kubernetes.io/projected/e02d5e27-7333-463b-9144-9fa72da9592c-kube-api-access-nqnxt\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.825976 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-metrics\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.825998 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-conf\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.826020 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-reloader\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.826046 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e02d5e27-7333-463b-9144-9fa72da9592c-frr-startup\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.826075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-sockets\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.826102 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m685w\" (UniqueName: 
\"kubernetes.io/projected/a8f17329-5c04-45bf-9246-fa2aec1e793f-kube-api-access-m685w\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.839442 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-dwtb7"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.851752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.856279 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.860613 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-dwtb7"] Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.905218 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerStarted","Data":"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c"} Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927747 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m685w\" (UniqueName: \"kubernetes.io/projected/a8f17329-5c04-45bf-9246-fa2aec1e793f-kube-api-access-m685w\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist\") pod \"speaker-bn5m8\" (UID: 
\"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927820 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xhh\" (UniqueName: \"kubernetes.io/projected/f1995507-c619-4057-921c-8ca218c4f82e-kube-api-access-x2xhh\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927837 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8f17329-5c04-45bf-9246-fa2aec1e793f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927862 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e02d5e27-7333-463b-9144-9fa72da9592c-metrics-certs\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927879 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f1995507-c619-4057-921c-8ca218c4f82e-metallb-excludel2\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnxt\" (UniqueName: \"kubernetes.io/projected/e02d5e27-7333-463b-9144-9fa72da9592c-kube-api-access-nqnxt\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 
19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-metrics\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927942 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-conf\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927963 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-reloader\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.927982 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e02d5e27-7333-463b-9144-9fa72da9592c-frr-startup\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.928001 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.928027 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-sockets\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.928379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-sockets\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.928864 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-reloader\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.929059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-frr-conf\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.929360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e02d5e27-7333-463b-9144-9fa72da9592c-metrics\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.930168 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e02d5e27-7333-463b-9144-9fa72da9592c-frr-startup\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.935094 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e02d5e27-7333-463b-9144-9fa72da9592c-metrics-certs\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.935490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8f17329-5c04-45bf-9246-fa2aec1e793f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.946787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnxt\" (UniqueName: \"kubernetes.io/projected/e02d5e27-7333-463b-9144-9fa72da9592c-kube-api-access-nqnxt\") pod \"frr-k8s-pdvpg\" (UID: \"e02d5e27-7333-463b-9144-9fa72da9592c\") " pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:31 crc kubenswrapper[5033]: I0319 19:11:31.960995 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m685w\" (UniqueName: \"kubernetes.io/projected/a8f17329-5c04-45bf-9246-fa2aec1e793f-kube-api-access-m685w\") pod \"frr-k8s-webhook-server-bcc4b6f68-8tmvb\" (UID: \"a8f17329-5c04-45bf-9246-fa2aec1e793f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029303 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029362 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fqnsz\" (UniqueName: \"kubernetes.io/projected/cb4013d3-e95f-4c07-803c-fef1db499d5f-kube-api-access-fqnsz\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029390 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-cert\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029416 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xhh\" (UniqueName: \"kubernetes.io/projected/f1995507-c619-4057-921c-8ca218c4f82e-kube-api-access-x2xhh\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029462 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-metrics-certs\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.029488 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/f1995507-c619-4057-921c-8ca218c4f82e-metallb-excludel2\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.029969 5033 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.030100 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist podName:f1995507-c619-4057-921c-8ca218c4f82e nodeName:}" failed. No retries permitted until 2026-03-19 19:11:32.53007878 +0000 UTC m=+902.635108619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist") pod "speaker-bn5m8" (UID: "f1995507-c619-4057-921c-8ca218c4f82e") : secret "metallb-memberlist" not found Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.030216 5033 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.030295 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs podName:f1995507-c619-4057-921c-8ca218c4f82e nodeName:}" failed. No retries permitted until 2026-03-19 19:11:32.530287195 +0000 UTC m=+902.635317044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs") pod "speaker-bn5m8" (UID: "f1995507-c619-4057-921c-8ca218c4f82e") : secret "speaker-certs-secret" not found Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.030786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f1995507-c619-4057-921c-8ca218c4f82e-metallb-excludel2\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.041362 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.047969 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xhh\" (UniqueName: \"kubernetes.io/projected/f1995507-c619-4057-921c-8ca218c4f82e-kube-api-access-x2xhh\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.058326 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.130080 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnsz\" (UniqueName: \"kubernetes.io/projected/cb4013d3-e95f-4c07-803c-fef1db499d5f-kube-api-access-fqnsz\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.130124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-cert\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.130162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-metrics-certs\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.133900 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-metrics-certs\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.133954 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.145309 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cb4013d3-e95f-4c07-803c-fef1db499d5f-cert\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.148788 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqnsz\" (UniqueName: \"kubernetes.io/projected/cb4013d3-e95f-4c07-803c-fef1db499d5f-kube-api-access-fqnsz\") pod \"controller-7bb4cc7c98-dwtb7\" (UID: \"cb4013d3-e95f-4c07-803c-fef1db499d5f\") " pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.172120 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.286029 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8k6f" podStartSLOduration=3.608971973 podStartE2EDuration="6.286011915s" podCreationTimestamp="2026-03-19 19:11:26 +0000 UTC" firstStartedPulling="2026-03-19 19:11:28.685211135 +0000 UTC m=+898.790240984" lastFinishedPulling="2026-03-19 19:11:31.362251067 +0000 UTC m=+901.467280926" observedRunningTime="2026-03-19 19:11:31.92707773 +0000 UTC m=+902.032107579" watchObservedRunningTime="2026-03-19 19:11:32.286011915 +0000 UTC m=+902.391041754" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.288966 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb"] Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.535136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 
19:11:32.535250 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.535324 5033 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 19:11:32 crc kubenswrapper[5033]: E0319 19:11:32.535407 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist podName:f1995507-c619-4057-921c-8ca218c4f82e nodeName:}" failed. No retries permitted until 2026-03-19 19:11:33.535390757 +0000 UTC m=+903.640420606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist") pod "speaker-bn5m8" (UID: "f1995507-c619-4057-921c-8ca218c4f82e") : secret "metallb-memberlist" not found Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.540045 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-metrics-certs\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:32 crc kubenswrapper[5033]: W0319 19:11:32.584908 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4013d3_e95f_4c07_803c_fef1db499d5f.slice/crio-2583d8c12054a52876114e700b147c7cbf217f6a923902b144fe8196cef91164 WatchSource:0}: Error finding container 2583d8c12054a52876114e700b147c7cbf217f6a923902b144fe8196cef91164: Status 404 returned error can't find the container with id 2583d8c12054a52876114e700b147c7cbf217f6a923902b144fe8196cef91164 Mar 
19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.586205 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-dwtb7"] Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.911655 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-dwtb7" event={"ID":"cb4013d3-e95f-4c07-803c-fef1db499d5f","Type":"ContainerStarted","Data":"1d4be1248b68d6f9f4adb1bc6531baecde9fcaf993a4db72a9fa70577cecd3ce"} Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.911701 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-dwtb7" event={"ID":"cb4013d3-e95f-4c07-803c-fef1db499d5f","Type":"ContainerStarted","Data":"4d22a55a6d2b5fa733f880dda5518d7f64f6155b88dd40c259ee6a823626133c"} Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.911712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-dwtb7" event={"ID":"cb4013d3-e95f-4c07-803c-fef1db499d5f","Type":"ContainerStarted","Data":"2583d8c12054a52876114e700b147c7cbf217f6a923902b144fe8196cef91164"} Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.911787 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.912963 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"3c8f9710ac713bd35c33603215eafed9a59c3d0e6d989b7c25ed85fa9c43bcfd"} Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.914269 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" event={"ID":"a8f17329-5c04-45bf-9246-fa2aec1e793f","Type":"ContainerStarted","Data":"a655ffb43e9812505cce1c14a3ca6e9f2560c1295906edb97f59e3cf88ae3f76"} Mar 19 19:11:32 crc kubenswrapper[5033]: I0319 19:11:32.928411 
5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-dwtb7" podStartSLOduration=1.928394092 podStartE2EDuration="1.928394092s" podCreationTimestamp="2026-03-19 19:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:11:32.925767598 +0000 UTC m=+903.030797447" watchObservedRunningTime="2026-03-19 19:11:32.928394092 +0000 UTC m=+903.033423941" Mar 19 19:11:33 crc kubenswrapper[5033]: I0319 19:11:33.546710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:33 crc kubenswrapper[5033]: I0319 19:11:33.551011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f1995507-c619-4057-921c-8ca218c4f82e-memberlist\") pod \"speaker-bn5m8\" (UID: \"f1995507-c619-4057-921c-8ca218c4f82e\") " pod="metallb-system/speaker-bn5m8" Mar 19 19:11:33 crc kubenswrapper[5033]: I0319 19:11:33.625644 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bn5m8" Mar 19 19:11:33 crc kubenswrapper[5033]: I0319 19:11:33.922761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bn5m8" event={"ID":"f1995507-c619-4057-921c-8ca218c4f82e","Type":"ContainerStarted","Data":"4018f0b2a18aee22d731534a07b8f9aca6c361310d1abdca2390e6c1db2dade8"} Mar 19 19:11:34 crc kubenswrapper[5033]: I0319 19:11:34.930303 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bn5m8" event={"ID":"f1995507-c619-4057-921c-8ca218c4f82e","Type":"ContainerStarted","Data":"f3a4534c3bd0a24fe98d2a7c0b3752cc5502b7119967cc441b7212255d2a5a55"} Mar 19 19:11:34 crc kubenswrapper[5033]: I0319 19:11:34.930604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bn5m8" event={"ID":"f1995507-c619-4057-921c-8ca218c4f82e","Type":"ContainerStarted","Data":"f26fe09d1c5505f86cb373b806d8a211c38de1ed5fee5cd141896667a52fdb7f"} Mar 19 19:11:34 crc kubenswrapper[5033]: I0319 19:11:34.931356 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bn5m8" Mar 19 19:11:34 crc kubenswrapper[5033]: I0319 19:11:34.959764 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bn5m8" podStartSLOduration=3.959743515 podStartE2EDuration="3.959743515s" podCreationTimestamp="2026-03-19 19:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:11:34.955281569 +0000 UTC m=+905.060311428" watchObservedRunningTime="2026-03-19 19:11:34.959743515 +0000 UTC m=+905.064773354" Mar 19 19:11:37 crc kubenswrapper[5033]: I0319 19:11:37.060748 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:37 crc kubenswrapper[5033]: I0319 19:11:37.061119 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:37 crc kubenswrapper[5033]: I0319 19:11:37.105969 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:37 crc kubenswrapper[5033]: I0319 19:11:37.986534 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:38 crc kubenswrapper[5033]: I0319 19:11:38.028768 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:39 crc kubenswrapper[5033]: I0319 19:11:39.965887 5033 generic.go:334] "Generic (PLEG): container finished" podID="e02d5e27-7333-463b-9144-9fa72da9592c" containerID="23d13c557f80dfadee59f4f3f4563a4bc4ade3ff76e61d693379a519b45fee15" exitCode=0 Mar 19 19:11:39 crc kubenswrapper[5033]: I0319 19:11:39.965953 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerDied","Data":"23d13c557f80dfadee59f4f3f4563a4bc4ade3ff76e61d693379a519b45fee15"} Mar 19 19:11:39 crc kubenswrapper[5033]: I0319 19:11:39.968877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" event={"ID":"a8f17329-5c04-45bf-9246-fa2aec1e793f","Type":"ContainerStarted","Data":"0fa4bf59de85790cefaee5fa3b1b18ce16a5038e7fac8cd5460d47ca98d2b61c"} Mar 19 19:11:39 crc kubenswrapper[5033]: I0319 19:11:39.968982 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8k6f" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="registry-server" containerID="cri-o://8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c" gracePeriod=2 Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.308576 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.323340 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" podStartSLOduration=2.260734639 podStartE2EDuration="9.323324077s" podCreationTimestamp="2026-03-19 19:11:31 +0000 UTC" firstStartedPulling="2026-03-19 19:11:32.303996422 +0000 UTC m=+902.409026271" lastFinishedPulling="2026-03-19 19:11:39.36658586 +0000 UTC m=+909.471615709" observedRunningTime="2026-03-19 19:11:40.006748574 +0000 UTC m=+910.111778423" watchObservedRunningTime="2026-03-19 19:11:40.323324077 +0000 UTC m=+910.428353926" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.449332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content\") pod \"9908e686-0752-455b-a658-027cc16ca41c\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.449797 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pthqb\" (UniqueName: \"kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb\") pod \"9908e686-0752-455b-a658-027cc16ca41c\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.449831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities\") pod \"9908e686-0752-455b-a658-027cc16ca41c\" (UID: \"9908e686-0752-455b-a658-027cc16ca41c\") " Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.451098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities" (OuterVolumeSpecName: "utilities") pod "9908e686-0752-455b-a658-027cc16ca41c" (UID: "9908e686-0752-455b-a658-027cc16ca41c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.458140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb" (OuterVolumeSpecName: "kube-api-access-pthqb") pod "9908e686-0752-455b-a658-027cc16ca41c" (UID: "9908e686-0752-455b-a658-027cc16ca41c"). InnerVolumeSpecName "kube-api-access-pthqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.531365 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9908e686-0752-455b-a658-027cc16ca41c" (UID: "9908e686-0752-455b-a658-027cc16ca41c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.551750 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.551782 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pthqb\" (UniqueName: \"kubernetes.io/projected/9908e686-0752-455b-a658-027cc16ca41c-kube-api-access-pthqb\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.551803 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9908e686-0752-455b-a658-027cc16ca41c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.758233 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.758308 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.758347 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.758906 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.758954 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8" gracePeriod=600 Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.976581 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8" exitCode=0 Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.976642 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8"} Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.976683 5033 scope.go:117] "RemoveContainer" containerID="ce82824f000a75252358098db018d1c8aa3c842dfefea3dd71ed110b4364a9cc" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.979242 5033 generic.go:334] "Generic (PLEG): container finished" podID="9908e686-0752-455b-a658-027cc16ca41c" containerID="8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c" exitCode=0 Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.979360 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8k6f" Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.979383 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerDied","Data":"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c"} Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.979506 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8k6f" event={"ID":"9908e686-0752-455b-a658-027cc16ca41c","Type":"ContainerDied","Data":"757b2f3dabd4eb6ab5161d21a306b442e05660e812e3033589de38cb4e22bc17"} Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.981947 5033 generic.go:334] "Generic (PLEG): container finished" podID="e02d5e27-7333-463b-9144-9fa72da9592c" containerID="d903c285d95bda0279b7480dab69f88748f9151fa47b517f0109b6bba155b9e8" exitCode=0 Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.982537 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerDied","Data":"d903c285d95bda0279b7480dab69f88748f9151fa47b517f0109b6bba155b9e8"} Mar 19 19:11:40 crc kubenswrapper[5033]: I0319 19:11:40.982697 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.019418 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.024789 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8k6f"] Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.033552 5033 scope.go:117] "RemoveContainer" containerID="8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c" Mar 
19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.058504 5033 scope.go:117] "RemoveContainer" containerID="d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.098030 5033 scope.go:117] "RemoveContainer" containerID="c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.119872 5033 scope.go:117] "RemoveContainer" containerID="8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c" Mar 19 19:11:41 crc kubenswrapper[5033]: E0319 19:11:41.120352 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c\": container with ID starting with 8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c not found: ID does not exist" containerID="8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.120394 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c"} err="failed to get container status \"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c\": rpc error: code = NotFound desc = could not find container \"8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c\": container with ID starting with 8f363dcbf7caed8b856c8aa7e621aef37fa35c88278e1dfd91ca7152c39ede2c not found: ID does not exist" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.120419 5033 scope.go:117] "RemoveContainer" containerID="d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c" Mar 19 19:11:41 crc kubenswrapper[5033]: E0319 19:11:41.120877 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c\": container with ID starting with d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c not found: ID does not exist" containerID="d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.120902 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c"} err="failed to get container status \"d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c\": rpc error: code = NotFound desc = could not find container \"d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c\": container with ID starting with d1825623e7e2b777c0b120e631279702eed5bb349561dbe1cfa1ef92aa124b0c not found: ID does not exist" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.120918 5033 scope.go:117] "RemoveContainer" containerID="c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8" Mar 19 19:11:41 crc kubenswrapper[5033]: E0319 19:11:41.121410 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8\": container with ID starting with c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8 not found: ID does not exist" containerID="c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.121469 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8"} err="failed to get container status \"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8\": rpc error: code = NotFound desc = could not find container \"c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8\": container with ID 
starting with c150ed7e0c5a1b9d61da02dd07aae8179c2f58aacf414c7e749148712279ecf8 not found: ID does not exist" Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.992858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750"} Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.996237 5033 generic.go:334] "Generic (PLEG): container finished" podID="e02d5e27-7333-463b-9144-9fa72da9592c" containerID="3b67d454d202eeb7918d593fb26bee488d07917ddc17140f64f974e5df9026fb" exitCode=0 Mar 19 19:11:41 crc kubenswrapper[5033]: I0319 19:11:41.996361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerDied","Data":"3b67d454d202eeb7918d593fb26bee488d07917ddc17140f64f974e5df9026fb"} Mar 19 19:11:42 crc kubenswrapper[5033]: I0319 19:11:42.176797 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-dwtb7" Mar 19 19:11:42 crc kubenswrapper[5033]: I0319 19:11:42.629249 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9908e686-0752-455b-a658-027cc16ca41c" path="/var/lib/kubelet/pods/9908e686-0752-455b-a658-027cc16ca41c/volumes" Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.009017 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"16e2f3ef6ffaaa390ea59175715b20eee3aa197c908498c09dd0c014fd2b3121"} Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.009055 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" 
event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"698c356c1802dbf25e4b0a289e5b8e423b95748cf7435d600aa6ce46615e78c6"} Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.009064 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"f127567e034727088d21368f0e551eecabf61a613f8b3ac1135c47a1a0efddb4"} Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.009073 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"3dbaaf69d9564dc030f46c4f6ebb0dc048dd9b769984319de7ef3aee60881d90"} Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.009080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"f01efb6e50136f720ba44dab8cf24636904eac6950a7ea8dcc3fa7027c1b8843"} Mar 19 19:11:43 crc kubenswrapper[5033]: I0319 19:11:43.629908 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bn5m8" Mar 19 19:11:44 crc kubenswrapper[5033]: I0319 19:11:44.022013 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pdvpg" event={"ID":"e02d5e27-7333-463b-9144-9fa72da9592c","Type":"ContainerStarted","Data":"559c1ed6bf41374e38fc14d841451697704f221baddd591958cfbfc9c5edae94"} Mar 19 19:11:44 crc kubenswrapper[5033]: I0319 19:11:44.023174 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:44 crc kubenswrapper[5033]: I0319 19:11:44.054807 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pdvpg" podStartSLOduration=5.846541151 podStartE2EDuration="13.05478102s" podCreationTimestamp="2026-03-19 19:11:31 +0000 
UTC" firstStartedPulling="2026-03-19 19:11:32.157882318 +0000 UTC m=+902.262912167" lastFinishedPulling="2026-03-19 19:11:39.366122187 +0000 UTC m=+909.471152036" observedRunningTime="2026-03-19 19:11:44.048569906 +0000 UTC m=+914.153599765" watchObservedRunningTime="2026-03-19 19:11:44.05478102 +0000 UTC m=+914.159810889" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.388666 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:46 crc kubenswrapper[5033]: E0319 19:11:46.389339 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="extract-utilities" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.389362 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="extract-utilities" Mar 19 19:11:46 crc kubenswrapper[5033]: E0319 19:11:46.389380 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="extract-content" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.389391 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="extract-content" Mar 19 19:11:46 crc kubenswrapper[5033]: E0319 19:11:46.389407 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="registry-server" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.389418 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="registry-server" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.389667 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9908e686-0752-455b-a658-027cc16ca41c" containerName="registry-server" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.390390 5033 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.392197 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-v6plq" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.392826 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.396113 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.412240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.531552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8qj\" (UniqueName: \"kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj\") pod \"openstack-operator-index-r2t2r\" (UID: \"9fc20a1d-bec9-4345-af63-2c88871568b7\") " pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.632284 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8qj\" (UniqueName: \"kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj\") pod \"openstack-operator-index-r2t2r\" (UID: \"9fc20a1d-bec9-4345-af63-2c88871568b7\") " pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.658425 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8qj\" (UniqueName: \"kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj\") pod \"openstack-operator-index-r2t2r\" (UID: 
\"9fc20a1d-bec9-4345-af63-2c88871568b7\") " pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:46 crc kubenswrapper[5033]: I0319 19:11:46.714527 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:47 crc kubenswrapper[5033]: I0319 19:11:47.045610 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:47 crc kubenswrapper[5033]: I0319 19:11:47.083545 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:47 crc kubenswrapper[5033]: W0319 19:11:47.203127 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc20a1d_bec9_4345_af63_2c88871568b7.slice/crio-116ac6c64f778bbe107f04494d8724d4bfd622b107fc6cb2925d51c5c2195c6a WatchSource:0}: Error finding container 116ac6c64f778bbe107f04494d8724d4bfd622b107fc6cb2925d51c5c2195c6a: Status 404 returned error can't find the container with id 116ac6c64f778bbe107f04494d8724d4bfd622b107fc6cb2925d51c5c2195c6a Mar 19 19:11:47 crc kubenswrapper[5033]: I0319 19:11:47.207841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:48 crc kubenswrapper[5033]: I0319 19:11:48.058625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r2t2r" event={"ID":"9fc20a1d-bec9-4345-af63-2c88871568b7","Type":"ContainerStarted","Data":"116ac6c64f778bbe107f04494d8724d4bfd622b107fc6cb2925d51c5c2195c6a"} Mar 19 19:11:49 crc kubenswrapper[5033]: I0319 19:11:49.360379 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.073385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-r2t2r" event={"ID":"9fc20a1d-bec9-4345-af63-2c88871568b7","Type":"ContainerStarted","Data":"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5"} Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.073647 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-r2t2r" podUID="9fc20a1d-bec9-4345-af63-2c88871568b7" containerName="registry-server" containerID="cri-o://69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5" gracePeriod=2 Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.107255 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r2t2r" podStartSLOduration=1.94331133 podStartE2EDuration="4.107223104s" podCreationTimestamp="2026-03-19 19:11:46 +0000 UTC" firstStartedPulling="2026-03-19 19:11:47.208521985 +0000 UTC m=+917.313551824" lastFinishedPulling="2026-03-19 19:11:49.372433749 +0000 UTC m=+919.477463598" observedRunningTime="2026-03-19 19:11:50.095034231 +0000 UTC m=+920.200064100" watchObservedRunningTime="2026-03-19 19:11:50.107223104 +0000 UTC m=+920.212252993" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.169155 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nhm8m"] Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.170018 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.182017 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nhm8m"] Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.282422 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7g6\" (UniqueName: \"kubernetes.io/projected/9b05fede-26cf-4467-9222-df3f6af81c0a-kube-api-access-7j7g6\") pod \"openstack-operator-index-nhm8m\" (UID: \"9b05fede-26cf-4467-9222-df3f6af81c0a\") " pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.383713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7g6\" (UniqueName: \"kubernetes.io/projected/9b05fede-26cf-4467-9222-df3f6af81c0a-kube-api-access-7j7g6\") pod \"openstack-operator-index-nhm8m\" (UID: \"9b05fede-26cf-4467-9222-df3f6af81c0a\") " pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.409266 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7g6\" (UniqueName: \"kubernetes.io/projected/9b05fede-26cf-4467-9222-df3f6af81c0a-kube-api-access-7j7g6\") pod \"openstack-operator-index-nhm8m\" (UID: \"9b05fede-26cf-4467-9222-df3f6af81c0a\") " pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.512229 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.549829 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.586210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr8qj\" (UniqueName: \"kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj\") pod \"9fc20a1d-bec9-4345-af63-2c88871568b7\" (UID: \"9fc20a1d-bec9-4345-af63-2c88871568b7\") " Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.594639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj" (OuterVolumeSpecName: "kube-api-access-rr8qj") pod "9fc20a1d-bec9-4345-af63-2c88871568b7" (UID: "9fc20a1d-bec9-4345-af63-2c88871568b7"). InnerVolumeSpecName "kube-api-access-rr8qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.689852 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr8qj\" (UniqueName: \"kubernetes.io/projected/9fc20a1d-bec9-4345-af63-2c88871568b7-kube-api-access-rr8qj\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:50 crc kubenswrapper[5033]: I0319 19:11:50.967497 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nhm8m"] Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.091414 5033 generic.go:334] "Generic (PLEG): container finished" podID="9fc20a1d-bec9-4345-af63-2c88871568b7" containerID="69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5" exitCode=0 Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.091806 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r2t2r" Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.091848 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r2t2r" event={"ID":"9fc20a1d-bec9-4345-af63-2c88871568b7","Type":"ContainerDied","Data":"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5"} Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.091949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r2t2r" event={"ID":"9fc20a1d-bec9-4345-af63-2c88871568b7","Type":"ContainerDied","Data":"116ac6c64f778bbe107f04494d8724d4bfd622b107fc6cb2925d51c5c2195c6a"} Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.092121 5033 scope.go:117] "RemoveContainer" containerID="69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5" Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.096938 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nhm8m" event={"ID":"9b05fede-26cf-4467-9222-df3f6af81c0a","Type":"ContainerStarted","Data":"b451476ee1c91bfceaac2c8b9522aadeeb3289561640840f68478284a3cf6a79"} Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.114276 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.120172 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-r2t2r"] Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.120489 5033 scope.go:117] "RemoveContainer" containerID="69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5" Mar 19 19:11:51 crc kubenswrapper[5033]: E0319 19:11:51.121014 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5\": container with ID starting with 69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5 not found: ID does not exist" containerID="69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5" Mar 19 19:11:51 crc kubenswrapper[5033]: I0319 19:11:51.121116 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5"} err="failed to get container status \"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5\": rpc error: code = NotFound desc = could not find container \"69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5\": container with ID starting with 69aa9498eeba49323dc0f08f420fd238c8db7179149e63ebb21fb53b3df1acc5 not found: ID does not exist" Mar 19 19:11:52 crc kubenswrapper[5033]: I0319 19:11:52.044782 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pdvpg" Mar 19 19:11:52 crc kubenswrapper[5033]: I0319 19:11:52.068294 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8tmvb" Mar 19 19:11:52 crc kubenswrapper[5033]: I0319 19:11:52.117088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nhm8m" event={"ID":"9b05fede-26cf-4467-9222-df3f6af81c0a","Type":"ContainerStarted","Data":"048c66c007dd7cae197babe5102c758e640e1d57687c591779fc36b53d3a03fc"} Mar 19 19:11:52 crc kubenswrapper[5033]: I0319 19:11:52.633607 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc20a1d-bec9-4345-af63-2c88871568b7" path="/var/lib/kubelet/pods/9fc20a1d-bec9-4345-af63-2c88871568b7/volumes" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.572188 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-nhm8m" podStartSLOduration=3.525028319 podStartE2EDuration="3.572155885s" podCreationTimestamp="2026-03-19 19:11:50 +0000 UTC" firstStartedPulling="2026-03-19 19:11:50.974036053 +0000 UTC m=+921.079065942" lastFinishedPulling="2026-03-19 19:11:51.021163659 +0000 UTC m=+921.126193508" observedRunningTime="2026-03-19 19:11:52.140324438 +0000 UTC m=+922.245354297" watchObservedRunningTime="2026-03-19 19:11:53.572155885 +0000 UTC m=+923.677185774" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.573665 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:11:53 crc kubenswrapper[5033]: E0319 19:11:53.574100 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc20a1d-bec9-4345-af63-2c88871568b7" containerName="registry-server" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.574150 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc20a1d-bec9-4345-af63-2c88871568b7" containerName="registry-server" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.574400 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc20a1d-bec9-4345-af63-2c88871568b7" containerName="registry-server" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.576345 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.589790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.664663 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.664833 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.664927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m748t\" (UniqueName: \"kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.765777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m748t\" (UniqueName: \"kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.765889 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.765975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.766641 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.767384 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.793640 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m748t\" (UniqueName: \"kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t\") pod \"redhat-marketplace-ghgdv\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:53 crc kubenswrapper[5033]: I0319 19:11:53.938850 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:11:54 crc kubenswrapper[5033]: I0319 19:11:54.201916 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:11:54 crc kubenswrapper[5033]: W0319 19:11:54.211905 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a0aa80_a104_4899_ad6c_6c0fc09e997d.slice/crio-3beac164ef8bd020828215560dd8b4f18efb1e534b6702fe601bb747815a6526 WatchSource:0}: Error finding container 3beac164ef8bd020828215560dd8b4f18efb1e534b6702fe601bb747815a6526: Status 404 returned error can't find the container with id 3beac164ef8bd020828215560dd8b4f18efb1e534b6702fe601bb747815a6526 Mar 19 19:11:55 crc kubenswrapper[5033]: I0319 19:11:55.138356 5033 generic.go:334] "Generic (PLEG): container finished" podID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerID="089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811" exitCode=0 Mar 19 19:11:55 crc kubenswrapper[5033]: I0319 19:11:55.138471 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerDied","Data":"089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811"} Mar 19 19:11:55 crc kubenswrapper[5033]: I0319 19:11:55.138667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerStarted","Data":"3beac164ef8bd020828215560dd8b4f18efb1e534b6702fe601bb747815a6526"} Mar 19 19:11:56 crc kubenswrapper[5033]: I0319 19:11:56.148268 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" 
event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerStarted","Data":"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86"} Mar 19 19:11:57 crc kubenswrapper[5033]: I0319 19:11:57.158548 5033 generic.go:334] "Generic (PLEG): container finished" podID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerID="005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86" exitCode=0 Mar 19 19:11:57 crc kubenswrapper[5033]: I0319 19:11:57.158610 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerDied","Data":"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86"} Mar 19 19:11:58 crc kubenswrapper[5033]: I0319 19:11:58.169045 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerStarted","Data":"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3"} Mar 19 19:11:58 crc kubenswrapper[5033]: I0319 19:11:58.198018 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghgdv" podStartSLOduration=2.770627202 podStartE2EDuration="5.197999739s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" firstStartedPulling="2026-03-19 19:11:55.140256336 +0000 UTC m=+925.245286185" lastFinishedPulling="2026-03-19 19:11:57.567628873 +0000 UTC m=+927.672658722" observedRunningTime="2026-03-19 19:11:58.189775358 +0000 UTC m=+928.294805217" watchObservedRunningTime="2026-03-19 19:11:58.197999739 +0000 UTC m=+928.303029598" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.127764 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565792-b6mbz"] Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.129060 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.130940 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.131196 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.135772 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.137484 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-b6mbz"] Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.261783 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9kt\" (UniqueName: \"kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt\") pod \"auto-csr-approver-29565792-b6mbz\" (UID: \"017b2cf3-89c3-438d-a4f2-cc11221cc49a\") " pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.363562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9kt\" (UniqueName: \"kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt\") pod \"auto-csr-approver-29565792-b6mbz\" (UID: \"017b2cf3-89c3-438d-a4f2-cc11221cc49a\") " pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.385910 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9kt\" (UniqueName: \"kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt\") pod \"auto-csr-approver-29565792-b6mbz\" (UID: \"017b2cf3-89c3-438d-a4f2-cc11221cc49a\") " 
pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.443697 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.553794 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.554697 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.604369 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:12:00 crc kubenswrapper[5033]: I0319 19:12:00.894425 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-b6mbz"] Mar 19 19:12:00 crc kubenswrapper[5033]: W0319 19:12:00.897109 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod017b2cf3_89c3_438d_a4f2_cc11221cc49a.slice/crio-11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55 WatchSource:0}: Error finding container 11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55: Status 404 returned error can't find the container with id 11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55 Mar 19 19:12:01 crc kubenswrapper[5033]: I0319 19:12:01.187257 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" event={"ID":"017b2cf3-89c3-438d-a4f2-cc11221cc49a","Type":"ContainerStarted","Data":"11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55"} Mar 19 19:12:01 crc kubenswrapper[5033]: I0319 19:12:01.213786 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-index-nhm8m" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.802104 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6"] Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.804520 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.806485 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-f7whp" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.811840 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6"] Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.896972 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.897051 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjm2\" (UniqueName: \"kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.897149 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.998769 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.998957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.999097 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjm2\" (UniqueName: \"kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.999544 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle\") pod 
\"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:02 crc kubenswrapper[5033]: I0319 19:12:02.999707 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.018715 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjm2\" (UniqueName: \"kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.166879 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.203130 5033 generic.go:334] "Generic (PLEG): container finished" podID="017b2cf3-89c3-438d-a4f2-cc11221cc49a" containerID="0c0d4dc9d0d61c82b2904e645db0581653effcaf5751d24ff7d46c670c86f5b0" exitCode=0 Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.203218 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" event={"ID":"017b2cf3-89c3-438d-a4f2-cc11221cc49a","Type":"ContainerDied","Data":"0c0d4dc9d0d61c82b2904e645db0581653effcaf5751d24ff7d46c670c86f5b0"} Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.593119 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6"] Mar 19 19:12:03 crc kubenswrapper[5033]: W0319 19:12:03.602170 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c4876a_af26_49e2_95cf_232f2673a934.slice/crio-306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3 WatchSource:0}: Error finding container 306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3: Status 404 returned error can't find the container with id 306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3 Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.939390 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:03 crc kubenswrapper[5033]: I0319 19:12:03.939744 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.010965 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.212004 5033 generic.go:334] "Generic (PLEG): container finished" podID="78c4876a-af26-49e2-95cf-232f2673a934" containerID="a032d3efcdd4bb281cadde3550b98cdaffaff413eb0a2d91c2a1d4f65a7a6960" exitCode=0 Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.212040 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" event={"ID":"78c4876a-af26-49e2-95cf-232f2673a934","Type":"ContainerDied","Data":"a032d3efcdd4bb281cadde3550b98cdaffaff413eb0a2d91c2a1d4f65a7a6960"} Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.212078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" event={"ID":"78c4876a-af26-49e2-95cf-232f2673a934","Type":"ContainerStarted","Data":"306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3"} Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.255344 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.563637 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.723035 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf9kt\" (UniqueName: \"kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt\") pod \"017b2cf3-89c3-438d-a4f2-cc11221cc49a\" (UID: \"017b2cf3-89c3-438d-a4f2-cc11221cc49a\") " Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.727507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt" (OuterVolumeSpecName: "kube-api-access-nf9kt") pod "017b2cf3-89c3-438d-a4f2-cc11221cc49a" (UID: "017b2cf3-89c3-438d-a4f2-cc11221cc49a"). InnerVolumeSpecName "kube-api-access-nf9kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:12:04 crc kubenswrapper[5033]: I0319 19:12:04.824105 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf9kt\" (UniqueName: \"kubernetes.io/projected/017b2cf3-89c3-438d-a4f2-cc11221cc49a-kube-api-access-nf9kt\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.220400 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" event={"ID":"017b2cf3-89c3-438d-a4f2-cc11221cc49a","Type":"ContainerDied","Data":"11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55"} Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.220702 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a17ab0aa3cb12ada71e9f6737584cf892fc47dc45c06dee5f5e0afc82ffa55" Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.220436 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-b6mbz" Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.222556 5033 generic.go:334] "Generic (PLEG): container finished" podID="78c4876a-af26-49e2-95cf-232f2673a934" containerID="0afa205dee29865d45c859747bbbe18df83ae7225a372172eccf5e7844fc0b5c" exitCode=0 Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.222581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" event={"ID":"78c4876a-af26-49e2-95cf-232f2673a934","Type":"ContainerDied","Data":"0afa205dee29865d45c859747bbbe18df83ae7225a372172eccf5e7844fc0b5c"} Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.633597 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-njps6"] Mar 19 19:12:05 crc kubenswrapper[5033]: I0319 19:12:05.641652 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-njps6"] Mar 19 19:12:06 crc kubenswrapper[5033]: I0319 19:12:06.234382 5033 generic.go:334] "Generic (PLEG): container finished" podID="78c4876a-af26-49e2-95cf-232f2673a934" containerID="51bf78743f09e03fc334c272ae2aff7dba4df3b66b566ae216487f168fd42a1d" exitCode=0 Mar 19 19:12:06 crc kubenswrapper[5033]: I0319 19:12:06.234424 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" event={"ID":"78c4876a-af26-49e2-95cf-232f2673a934","Type":"ContainerDied","Data":"51bf78743f09e03fc334c272ae2aff7dba4df3b66b566ae216487f168fd42a1d"} Mar 19 19:12:06 crc kubenswrapper[5033]: I0319 19:12:06.632957 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5823e80-5bf6-4248-b034-01d39e46d318" path="/var/lib/kubelet/pods/f5823e80-5bf6-4248-b034-01d39e46d318/volumes" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.557531 5033 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.557739 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghgdv" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="registry-server" containerID="cri-o://e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3" gracePeriod=2 Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.582443 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.763967 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util\") pod \"78c4876a-af26-49e2-95cf-232f2673a934\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.764414 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle\") pod \"78c4876a-af26-49e2-95cf-232f2673a934\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.765696 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle" (OuterVolumeSpecName: "bundle") pod "78c4876a-af26-49e2-95cf-232f2673a934" (UID: "78c4876a-af26-49e2-95cf-232f2673a934"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.766011 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxjm2\" (UniqueName: \"kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2\") pod \"78c4876a-af26-49e2-95cf-232f2673a934\" (UID: \"78c4876a-af26-49e2-95cf-232f2673a934\") " Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.766651 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.777318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2" (OuterVolumeSpecName: "kube-api-access-mxjm2") pod "78c4876a-af26-49e2-95cf-232f2673a934" (UID: "78c4876a-af26-49e2-95cf-232f2673a934"). InnerVolumeSpecName "kube-api-access-mxjm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.787428 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util" (OuterVolumeSpecName: "util") pod "78c4876a-af26-49e2-95cf-232f2673a934" (UID: "78c4876a-af26-49e2-95cf-232f2673a934"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.868387 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78c4876a-af26-49e2-95cf-232f2673a934-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.868427 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxjm2\" (UniqueName: \"kubernetes.io/projected/78c4876a-af26-49e2-95cf-232f2673a934-kube-api-access-mxjm2\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:07 crc kubenswrapper[5033]: I0319 19:12:07.938682 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.071775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities\") pod \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.071856 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content\") pod \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.071892 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m748t\" (UniqueName: \"kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t\") pod \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\" (UID: \"94a0aa80-a104-4899-ad6c-6c0fc09e997d\") " Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.073541 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities" (OuterVolumeSpecName: "utilities") pod "94a0aa80-a104-4899-ad6c-6c0fc09e997d" (UID: "94a0aa80-a104-4899-ad6c-6c0fc09e997d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.075818 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t" (OuterVolumeSpecName: "kube-api-access-m748t") pod "94a0aa80-a104-4899-ad6c-6c0fc09e997d" (UID: "94a0aa80-a104-4899-ad6c-6c0fc09e997d"). InnerVolumeSpecName "kube-api-access-m748t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.097382 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a0aa80-a104-4899-ad6c-6c0fc09e997d" (UID: "94a0aa80-a104-4899-ad6c-6c0fc09e997d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.173318 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.173367 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0aa80-a104-4899-ad6c-6c0fc09e997d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.173411 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m748t\" (UniqueName: \"kubernetes.io/projected/94a0aa80-a104-4899-ad6c-6c0fc09e997d-kube-api-access-m748t\") on node \"crc\" DevicePath \"\"" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.251787 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" event={"ID":"78c4876a-af26-49e2-95cf-232f2673a934","Type":"ContainerDied","Data":"306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3"} Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.251826 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306f1ea2f4d467c174882e5a31525e064d61e56f8647eab56f97b4e83030feb3" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.251797 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.253806 5033 generic.go:334] "Generic (PLEG): container finished" podID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerID="e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3" exitCode=0 Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.253841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerDied","Data":"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3"} Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.253866 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghgdv" event={"ID":"94a0aa80-a104-4899-ad6c-6c0fc09e997d","Type":"ContainerDied","Data":"3beac164ef8bd020828215560dd8b4f18efb1e534b6702fe601bb747815a6526"} Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.253873 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghgdv" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.253887 5033 scope.go:117] "RemoveContainer" containerID="e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.271198 5033 scope.go:117] "RemoveContainer" containerID="005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.288248 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.288284 5033 scope.go:117] "RemoveContainer" containerID="089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.294680 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghgdv"] Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.301339 5033 scope.go:117] "RemoveContainer" containerID="e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3" Mar 19 19:12:08 crc kubenswrapper[5033]: E0319 19:12:08.301826 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3\": container with ID starting with e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3 not found: ID does not exist" containerID="e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.301880 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3"} err="failed to get container status \"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3\": rpc error: code = NotFound desc = could not find container 
\"e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3\": container with ID starting with e04d1696ed265c7602eecccad79f09bc44cc1e62bb5552ad2d24a3135be6dbc3 not found: ID does not exist" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.301915 5033 scope.go:117] "RemoveContainer" containerID="005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86" Mar 19 19:12:08 crc kubenswrapper[5033]: E0319 19:12:08.302258 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86\": container with ID starting with 005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86 not found: ID does not exist" containerID="005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.302294 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86"} err="failed to get container status \"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86\": rpc error: code = NotFound desc = could not find container \"005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86\": container with ID starting with 005d57ddbc214ba6563651a780b20a821d38c1a90bface315a6ae07589159d86 not found: ID does not exist" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.302316 5033 scope.go:117] "RemoveContainer" containerID="089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811" Mar 19 19:12:08 crc kubenswrapper[5033]: E0319 19:12:08.302670 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811\": container with ID starting with 089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811 not found: ID does not exist" 
containerID="089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.302713 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811"} err="failed to get container status \"089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811\": rpc error: code = NotFound desc = could not find container \"089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811\": container with ID starting with 089341ae295c2abc9cff07689f33b9a3ebca5568ef724f399b964fed98f93811 not found: ID does not exist" Mar 19 19:12:08 crc kubenswrapper[5033]: I0319 19:12:08.629140 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" path="/var/lib/kubelet/pods/94a0aa80-a104-4899-ad6c-6c0fc09e997d/volumes" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.922670 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l"] Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923169 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="pull" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923183 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="pull" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923195 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="extract-utilities" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="extract-utilities" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923212 5033 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="extract-content" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923220 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="extract-content" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923232 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="extract" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923239 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="extract" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923259 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="registry-server" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923265 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="registry-server" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923271 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="util" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923276 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="util" Mar 19 19:12:10 crc kubenswrapper[5033]: E0319 19:12:10.923286 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2cf3-89c3-438d-a4f2-cc11221cc49a" containerName="oc" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923292 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2cf3-89c3-438d-a4f2-cc11221cc49a" containerName="oc" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923401 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2cf3-89c3-438d-a4f2-cc11221cc49a" 
containerName="oc" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923411 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c4876a-af26-49e2-95cf-232f2673a934" containerName="extract" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923419 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a0aa80-a104-4899-ad6c-6c0fc09e997d" containerName="registry-server" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.923915 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:10 crc kubenswrapper[5033]: I0319 19:12:10.928223 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-lpwv5" Mar 19 19:12:11 crc kubenswrapper[5033]: I0319 19:12:11.024006 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l"] Mar 19 19:12:11 crc kubenswrapper[5033]: I0319 19:12:11.109588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8zfp\" (UniqueName: \"kubernetes.io/projected/ea548f00-7f7c-4716-acb1-8bf41a3a9e9e-kube-api-access-w8zfp\") pod \"openstack-operator-controller-init-6c6f68556d-szp9l\" (UID: \"ea548f00-7f7c-4716-acb1-8bf41a3a9e9e\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:11 crc kubenswrapper[5033]: I0319 19:12:11.210567 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8zfp\" (UniqueName: \"kubernetes.io/projected/ea548f00-7f7c-4716-acb1-8bf41a3a9e9e-kube-api-access-w8zfp\") pod \"openstack-operator-controller-init-6c6f68556d-szp9l\" (UID: \"ea548f00-7f7c-4716-acb1-8bf41a3a9e9e\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:11 crc 
kubenswrapper[5033]: I0319 19:12:11.230237 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8zfp\" (UniqueName: \"kubernetes.io/projected/ea548f00-7f7c-4716-acb1-8bf41a3a9e9e-kube-api-access-w8zfp\") pod \"openstack-operator-controller-init-6c6f68556d-szp9l\" (UID: \"ea548f00-7f7c-4716-acb1-8bf41a3a9e9e\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:11 crc kubenswrapper[5033]: I0319 19:12:11.240263 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:11 crc kubenswrapper[5033]: I0319 19:12:11.473987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l"] Mar 19 19:12:12 crc kubenswrapper[5033]: I0319 19:12:12.291801 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" event={"ID":"ea548f00-7f7c-4716-acb1-8bf41a3a9e9e","Type":"ContainerStarted","Data":"2c00fff2ef17b18373879a2e478897e7a8a337fbd36878c9648cf61217916d50"} Mar 19 19:12:16 crc kubenswrapper[5033]: I0319 19:12:16.333419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" event={"ID":"ea548f00-7f7c-4716-acb1-8bf41a3a9e9e","Type":"ContainerStarted","Data":"62ba1322bc6dcf717336593cf811f2eebb035d322df07467b219cda42e8b6ab3"} Mar 19 19:12:16 crc kubenswrapper[5033]: I0319 19:12:16.333725 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:16 crc kubenswrapper[5033]: I0319 19:12:16.358206 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" podStartSLOduration=2.666422019 
podStartE2EDuration="6.358191282s" podCreationTimestamp="2026-03-19 19:12:10 +0000 UTC" firstStartedPulling="2026-03-19 19:12:11.480930403 +0000 UTC m=+941.585960242" lastFinishedPulling="2026-03-19 19:12:15.172699656 +0000 UTC m=+945.277729505" observedRunningTime="2026-03-19 19:12:16.356527815 +0000 UTC m=+946.461557674" watchObservedRunningTime="2026-03-19 19:12:16.358191282 +0000 UTC m=+946.463221131" Mar 19 19:12:21 crc kubenswrapper[5033]: I0319 19:12:21.243902 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-szp9l" Mar 19 19:12:31 crc kubenswrapper[5033]: I0319 19:12:31.297146 5033 scope.go:117] "RemoveContainer" containerID="530d4216967b7b9f17736b717db343b3048ea0a8aadb79c76916c810a64c439f" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.262406 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.263964 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.270424 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vffsj" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.280601 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.282094 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.284655 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4vzf8" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.300790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.313059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.328513 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.332286 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.339318 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vs9l9" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.374752 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.406633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4f96\" (UniqueName: \"kubernetes.io/projected/ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74-kube-api-access-m4f96\") pod \"cinder-operator-controller-manager-8d58dc466-t7sd4\" (UID: \"ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.409769 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmm5\" (UniqueName: \"kubernetes.io/projected/959b54c4-e249-46da-a57f-e997e6944147-kube-api-access-xpmm5\") pod \"barbican-operator-controller-manager-59bc569d95-zjvmm\" (UID: \"959b54c4-e249-46da-a57f-e997e6944147\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.433702 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.440331 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.444307 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-85r7x" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.448637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.457772 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.458774 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.463122 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wkc5v" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.464259 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.464992 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.473202 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bdfrw" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.473744 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.475013 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.481753 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.482623 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.487251 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.487285 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tmhqt" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.495114 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.495987 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.498113 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rqbnw" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.499555 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.511031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwm5\" (UniqueName: \"kubernetes.io/projected/e7453f1e-da50-4386-ac0d-64d309e192b0-kube-api-access-9jwm5\") pod \"designate-operator-controller-manager-588d4d986b-5rrzc\" (UID: \"e7453f1e-da50-4386-ac0d-64d309e192b0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.511137 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmm5\" (UniqueName: \"kubernetes.io/projected/959b54c4-e249-46da-a57f-e997e6944147-kube-api-access-xpmm5\") pod \"barbican-operator-controller-manager-59bc569d95-zjvmm\" (UID: \"959b54c4-e249-46da-a57f-e997e6944147\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.511174 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4f96\" (UniqueName: \"kubernetes.io/projected/ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74-kube-api-access-m4f96\") pod \"cinder-operator-controller-manager-8d58dc466-t7sd4\" (UID: \"ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.515872 5033 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.517217 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.520196 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cz58r" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.535993 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.540855 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmm5\" (UniqueName: \"kubernetes.io/projected/959b54c4-e249-46da-a57f-e997e6944147-kube-api-access-xpmm5\") pod \"barbican-operator-controller-manager-59bc569d95-zjvmm\" (UID: \"959b54c4-e249-46da-a57f-e997e6944147\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.547756 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.548725 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4f96\" (UniqueName: \"kubernetes.io/projected/ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74-kube-api-access-m4f96\") pod \"cinder-operator-controller-manager-8d58dc466-t7sd4\" (UID: \"ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.551176 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.552105 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.558861 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jhctp" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.562484 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.563300 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.568532 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.573189 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-k6w2j" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.578563 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.595379 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4nfws"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.596501 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.599942 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-k59md" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.600480 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhc5\" (UniqueName: \"kubernetes.io/projected/344bdd4e-7b37-4bd7-b403-58a5aa242946-kube-api-access-qrhc5\") pod \"heat-operator-controller-manager-67dd5f86f5-h4rn2\" (UID: \"344bdd4e-7b37-4bd7-b403-58a5aa242946\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612156 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlmh\" (UniqueName: \"kubernetes.io/projected/b5102fa8-cc34-4f68-a4af-f26a243c3238-kube-api-access-mjlmh\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612178 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612208 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwm5\" (UniqueName: \"kubernetes.io/projected/e7453f1e-da50-4386-ac0d-64d309e192b0-kube-api-access-9jwm5\") pod \"designate-operator-controller-manager-588d4d986b-5rrzc\" (UID: \"e7453f1e-da50-4386-ac0d-64d309e192b0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612234 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ncg\" (UniqueName: \"kubernetes.io/projected/80d312b1-fa3e-4746-baaf-1aa74b1a6a46-kube-api-access-m4ncg\") pod \"ironic-operator-controller-manager-6f787dddc9-lf4wr\" (UID: \"80d312b1-fa3e-4746-baaf-1aa74b1a6a46\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612263 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56z8g\" (UniqueName: \"kubernetes.io/projected/58ad3c6c-eec2-41dd-98ac-0ff2454ba608-kube-api-access-56z8g\") pod \"horizon-operator-controller-manager-8464cc45fb-xtxhs\" (UID: \"58ad3c6c-eec2-41dd-98ac-0ff2454ba608\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.612287 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8zn\" (UniqueName: \"kubernetes.io/projected/d565f597-d2c9-45cf-84fd-81986897c7ec-kube-api-access-hw8zn\") pod \"glance-operator-controller-manager-79df6bcc97-n8ssn\" (UID: \"d565f597-d2c9-45cf-84fd-81986897c7ec\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.619055 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4nfws"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.619230 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.626718 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.628010 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.631049 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-d4d25" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.633421 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwm5\" (UniqueName: \"kubernetes.io/projected/e7453f1e-da50-4386-ac0d-64d309e192b0-kube-api-access-9jwm5\") pod \"designate-operator-controller-manager-588d4d986b-5rrzc\" (UID: \"e7453f1e-da50-4386-ac0d-64d309e192b0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.653978 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.669977 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.670985 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.671962 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.678066 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.678683 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6s2vs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.694708 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.695620 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.700958 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.701041 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-drctk" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.710756 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713299 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlmh\" (UniqueName: \"kubernetes.io/projected/b5102fa8-cc34-4f68-a4af-f26a243c3238-kube-api-access-mjlmh\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713339 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713382 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bngx\" (UniqueName: \"kubernetes.io/projected/4e5bdc34-6776-4463-badb-8666194c5f89-kube-api-access-8bngx\") pod \"neutron-operator-controller-manager-767865f676-4nfws\" (UID: 
\"4e5bdc34-6776-4463-badb-8666194c5f89\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713404 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ncg\" (UniqueName: \"kubernetes.io/projected/80d312b1-fa3e-4746-baaf-1aa74b1a6a46-kube-api-access-m4ncg\") pod \"ironic-operator-controller-manager-6f787dddc9-lf4wr\" (UID: \"80d312b1-fa3e-4746-baaf-1aa74b1a6a46\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56z8g\" (UniqueName: \"kubernetes.io/projected/58ad3c6c-eec2-41dd-98ac-0ff2454ba608-kube-api-access-56z8g\") pod \"horizon-operator-controller-manager-8464cc45fb-xtxhs\" (UID: \"58ad3c6c-eec2-41dd-98ac-0ff2454ba608\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8zn\" (UniqueName: \"kubernetes.io/projected/d565f597-d2c9-45cf-84fd-81986897c7ec-kube-api-access-hw8zn\") pod \"glance-operator-controller-manager-79df6bcc97-n8ssn\" (UID: \"d565f597-d2c9-45cf-84fd-81986897c7ec\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzv6c\" (UniqueName: \"kubernetes.io/projected/b34dd812-d36d-4bfb-93df-03caacef3d64-kube-api-access-qzv6c\") pod \"keystone-operator-controller-manager-768b96df4c-dfthw\" (UID: \"b34dd812-d36d-4bfb-93df-03caacef3d64\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:12:41 
crc kubenswrapper[5033]: I0319 19:12:41.713524 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4jq\" (UniqueName: \"kubernetes.io/projected/22389b01-1060-4ddb-8fc3-3d3faeafefd2-kube-api-access-sp4jq\") pod \"manila-operator-controller-manager-55f864c847-j8lbc\" (UID: \"22389b01-1060-4ddb-8fc3-3d3faeafefd2\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713553 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhc5\" (UniqueName: \"kubernetes.io/projected/344bdd4e-7b37-4bd7-b403-58a5aa242946-kube-api-access-qrhc5\") pod \"heat-operator-controller-manager-67dd5f86f5-h4rn2\" (UID: \"344bdd4e-7b37-4bd7-b403-58a5aa242946\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.713592 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqlt\" (UniqueName: \"kubernetes.io/projected/98807d9f-1747-4534-9727-57a6d81775b6-kube-api-access-kmqlt\") pod \"mariadb-operator-controller-manager-67ccfc9778-r8r84\" (UID: \"98807d9f-1747-4534-9727-57a6d81775b6\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:41 crc kubenswrapper[5033]: E0319 19:12:41.713930 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:41 crc kubenswrapper[5033]: E0319 19:12:41.713977 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:42.213961772 +0000 UTC m=+972.318991621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.723760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.724636 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.726806 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5q9bc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.735780 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.748878 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.777612 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.790363 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56z8g\" (UniqueName: \"kubernetes.io/projected/58ad3c6c-eec2-41dd-98ac-0ff2454ba608-kube-api-access-56z8g\") pod \"horizon-operator-controller-manager-8464cc45fb-xtxhs\" (UID: \"58ad3c6c-eec2-41dd-98ac-0ff2454ba608\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.798321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wdvcs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.803090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhc5\" (UniqueName: \"kubernetes.io/projected/344bdd4e-7b37-4bd7-b403-58a5aa242946-kube-api-access-qrhc5\") pod \"heat-operator-controller-manager-67dd5f86f5-h4rn2\" (UID: \"344bdd4e-7b37-4bd7-b403-58a5aa242946\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.806244 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.811786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8zn\" (UniqueName: \"kubernetes.io/projected/d565f597-d2c9-45cf-84fd-81986897c7ec-kube-api-access-hw8zn\") pod \"glance-operator-controller-manager-79df6bcc97-n8ssn\" (UID: \"d565f597-d2c9-45cf-84fd-81986897c7ec\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.817847 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqt8z\" (UniqueName: \"kubernetes.io/projected/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-kube-api-access-bqt8z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.817912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqlt\" (UniqueName: \"kubernetes.io/projected/98807d9f-1747-4534-9727-57a6d81775b6-kube-api-access-kmqlt\") pod \"mariadb-operator-controller-manager-67ccfc9778-r8r84\" (UID: \"98807d9f-1747-4534-9727-57a6d81775b6\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.817936 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:41 
crc kubenswrapper[5033]: I0319 19:12:41.818005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bngx\" (UniqueName: \"kubernetes.io/projected/4e5bdc34-6776-4463-badb-8666194c5f89-kube-api-access-8bngx\") pod \"neutron-operator-controller-manager-767865f676-4nfws\" (UID: \"4e5bdc34-6776-4463-badb-8666194c5f89\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.818055 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzv6c\" (UniqueName: \"kubernetes.io/projected/b34dd812-d36d-4bfb-93df-03caacef3d64-kube-api-access-qzv6c\") pod \"keystone-operator-controller-manager-768b96df4c-dfthw\" (UID: \"b34dd812-d36d-4bfb-93df-03caacef3d64\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.818076 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrmn\" (UniqueName: \"kubernetes.io/projected/c8a93f31-4443-4247-b4de-d0bc1e26c6f8-kube-api-access-mwrmn\") pod \"nova-operator-controller-manager-5d488d59fb-n9lpt\" (UID: \"c8a93f31-4443-4247-b4de-d0bc1e26c6f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.818105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pjxp\" (UniqueName: \"kubernetes.io/projected/336483e6-616b-4489-bd29-5f7b62ef0d45-kube-api-access-5pjxp\") pod \"octavia-operator-controller-manager-5b9f45d989-hn4dm\" (UID: \"336483e6-616b-4489-bd29-5f7b62ef0d45\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.818128 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-sp4jq\" (UniqueName: \"kubernetes.io/projected/22389b01-1060-4ddb-8fc3-3d3faeafefd2-kube-api-access-sp4jq\") pod \"manila-operator-controller-manager-55f864c847-j8lbc\" (UID: \"22389b01-1060-4ddb-8fc3-3d3faeafefd2\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.832400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlmh\" (UniqueName: \"kubernetes.io/projected/b5102fa8-cc34-4f68-a4af-f26a243c3238-kube-api-access-mjlmh\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.832829 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ncg\" (UniqueName: \"kubernetes.io/projected/80d312b1-fa3e-4746-baaf-1aa74b1a6a46-kube-api-access-m4ncg\") pod \"ironic-operator-controller-manager-6f787dddc9-lf4wr\" (UID: \"80d312b1-fa3e-4746-baaf-1aa74b1a6a46\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.873167 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.876255 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqlt\" (UniqueName: \"kubernetes.io/projected/98807d9f-1747-4534-9727-57a6d81775b6-kube-api-access-kmqlt\") pod \"mariadb-operator-controller-manager-67ccfc9778-r8r84\" (UID: \"98807d9f-1747-4534-9727-57a6d81775b6\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.879066 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4jq\" (UniqueName: \"kubernetes.io/projected/22389b01-1060-4ddb-8fc3-3d3faeafefd2-kube-api-access-sp4jq\") pod \"manila-operator-controller-manager-55f864c847-j8lbc\" (UID: \"22389b01-1060-4ddb-8fc3-3d3faeafefd2\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.882831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bngx\" (UniqueName: \"kubernetes.io/projected/4e5bdc34-6776-4463-badb-8666194c5f89-kube-api-access-8bngx\") pod \"neutron-operator-controller-manager-767865f676-4nfws\" (UID: \"4e5bdc34-6776-4463-badb-8666194c5f89\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.898823 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.911046 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.921579 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzv6c\" (UniqueName: \"kubernetes.io/projected/b34dd812-d36d-4bfb-93df-03caacef3d64-kube-api-access-qzv6c\") pod \"keystone-operator-controller-manager-768b96df4c-dfthw\" (UID: \"b34dd812-d36d-4bfb-93df-03caacef3d64\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.922076 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zczlg"] Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.959809 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.961352 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.970358 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-wt99z" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.970902 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.974630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrmn\" (UniqueName: \"kubernetes.io/projected/c8a93f31-4443-4247-b4de-d0bc1e26c6f8-kube-api-access-mwrmn\") pod \"nova-operator-controller-manager-5d488d59fb-n9lpt\" (UID: \"c8a93f31-4443-4247-b4de-d0bc1e26c6f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.974803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gnn\" (UniqueName: \"kubernetes.io/projected/ae6d87d9-9ab4-4a9c-84f4-e5913246875e-kube-api-access-46gnn\") pod \"swift-operator-controller-manager-c674c5965-zczlg\" (UID: \"ae6d87d9-9ab4-4a9c-84f4-e5913246875e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.974896 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hfq5\" (UniqueName: \"kubernetes.io/projected/1d6c9166-95cd-42d6-9782-56e8879c0412-kube-api-access-7hfq5\") pod \"ovn-operator-controller-manager-884679f54-dgp4k\" (UID: \"1d6c9166-95cd-42d6-9782-56e8879c0412\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.974971 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pjxp\" (UniqueName: \"kubernetes.io/projected/336483e6-616b-4489-bd29-5f7b62ef0d45-kube-api-access-5pjxp\") pod \"octavia-operator-controller-manager-5b9f45d989-hn4dm\" (UID: \"336483e6-616b-4489-bd29-5f7b62ef0d45\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:12:41 crc 
kubenswrapper[5033]: I0319 19:12:41.975060 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqt8z\" (UniqueName: \"kubernetes.io/projected/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-kube-api-access-bqt8z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.981057 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.981191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlb2t\" (UniqueName: \"kubernetes.io/projected/f160f298-8509-46b1-865f-7241c0dd299f-kube-api-access-hlb2t\") pod \"placement-operator-controller-manager-5784578c99-w2pfb\" (UID: \"f160f298-8509-46b1-865f-7241c0dd299f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:41 crc kubenswrapper[5033]: E0319 19:12:41.981945 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:41 crc kubenswrapper[5033]: E0319 19:12:41.982046 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. 
No retries permitted until 2026-03-19 19:12:42.482032783 +0000 UTC m=+972.587062632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:41 crc kubenswrapper[5033]: I0319 19:12:41.996826 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zczlg"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.029253 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pjxp\" (UniqueName: \"kubernetes.io/projected/336483e6-616b-4489-bd29-5f7b62ef0d45-kube-api-access-5pjxp\") pod \"octavia-operator-controller-manager-5b9f45d989-hn4dm\" (UID: \"336483e6-616b-4489-bd29-5f7b62ef0d45\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.033385 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrmn\" (UniqueName: \"kubernetes.io/projected/c8a93f31-4443-4247-b4de-d0bc1e26c6f8-kube-api-access-mwrmn\") pod \"nova-operator-controller-manager-5d488d59fb-n9lpt\" (UID: \"c8a93f31-4443-4247-b4de-d0bc1e26c6f8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.034031 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqt8z\" (UniqueName: \"kubernetes.io/projected/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-kube-api-access-bqt8z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" 
Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.049332 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.064753 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.065506 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.066393 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.072763 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7w6mv" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.082049 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlb2t\" (UniqueName: \"kubernetes.io/projected/f160f298-8509-46b1-865f-7241c0dd299f-kube-api-access-hlb2t\") pod \"placement-operator-controller-manager-5784578c99-w2pfb\" (UID: \"f160f298-8509-46b1-865f-7241c0dd299f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.082114 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqkn\" (UniqueName: \"kubernetes.io/projected/2f3d3208-9746-409f-95e9-7ada3c61671d-kube-api-access-6hqkn\") pod \"telemetry-operator-controller-manager-6c5c766d94-gvt8l\" (UID: \"2f3d3208-9746-409f-95e9-7ada3c61671d\") " 
pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.082171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gnn\" (UniqueName: \"kubernetes.io/projected/ae6d87d9-9ab4-4a9c-84f4-e5913246875e-kube-api-access-46gnn\") pod \"swift-operator-controller-manager-c674c5965-zczlg\" (UID: \"ae6d87d9-9ab4-4a9c-84f4-e5913246875e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.082195 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hfq5\" (UniqueName: \"kubernetes.io/projected/1d6c9166-95cd-42d6-9782-56e8879c0412-kube-api-access-7hfq5\") pod \"ovn-operator-controller-manager-884679f54-dgp4k\" (UID: \"1d6c9166-95cd-42d6-9782-56e8879c0412\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.097767 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.098324 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.117417 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hfq5\" (UniqueName: \"kubernetes.io/projected/1d6c9166-95cd-42d6-9782-56e8879c0412-kube-api-access-7hfq5\") pod \"ovn-operator-controller-manager-884679f54-dgp4k\" (UID: \"1d6c9166-95cd-42d6-9782-56e8879c0412\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.117757 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.118648 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.122368 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gnn\" (UniqueName: \"kubernetes.io/projected/ae6d87d9-9ab4-4a9c-84f4-e5913246875e-kube-api-access-46gnn\") pod \"swift-operator-controller-manager-c674c5965-zczlg\" (UID: \"ae6d87d9-9ab4-4a9c-84f4-e5913246875e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.131005 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.131222 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xsc2w" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.137079 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlb2t\" (UniqueName: \"kubernetes.io/projected/f160f298-8509-46b1-865f-7241c0dd299f-kube-api-access-hlb2t\") pod \"placement-operator-controller-manager-5784578c99-w2pfb\" (UID: \"f160f298-8509-46b1-865f-7241c0dd299f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.147969 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.148830 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.160226 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-njcz9" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.186566 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.187040 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw5s\" (UniqueName: \"kubernetes.io/projected/fe433b1b-c379-493b-8e5a-74dff21a208d-kube-api-access-ssw5s\") pod \"test-operator-controller-manager-5c5cb9c4d7-8ggqw\" (UID: \"fe433b1b-c379-493b-8e5a-74dff21a208d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.187087 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwzx\" (UniqueName: \"kubernetes.io/projected/16cc55d4-44b6-4e27-9afd-484d4db42d1a-kube-api-access-hhwzx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-87zz5\" (UID: \"16cc55d4-44b6-4e27-9afd-484d4db42d1a\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.187131 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqkn\" (UniqueName: \"kubernetes.io/projected/2f3d3208-9746-409f-95e9-7ada3c61671d-kube-api-access-6hqkn\") pod \"telemetry-operator-controller-manager-6c5c766d94-gvt8l\" (UID: \"2f3d3208-9746-409f-95e9-7ada3c61671d\") " pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 
19:12:42.188749 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.197841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.217615 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqkn\" (UniqueName: \"kubernetes.io/projected/2f3d3208-9746-409f-95e9-7ada3c61671d-kube-api-access-6hqkn\") pod \"telemetry-operator-controller-manager-6c5c766d94-gvt8l\" (UID: \"2f3d3208-9746-409f-95e9-7ada3c61671d\") " pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.227673 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.228515 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.230814 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xqfm8" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.230999 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.231567 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.248138 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.272497 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.273389 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.275290 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-77v79" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.278365 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwzx\" (UniqueName: \"kubernetes.io/projected/16cc55d4-44b6-4e27-9afd-484d4db42d1a-kube-api-access-hhwzx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-87zz5\" (UID: \"16cc55d4-44b6-4e27-9afd-484d4db42d1a\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2jh\" (UniqueName: \"kubernetes.io/projected/668a0189-af32-4674-8c48-101dac5c1e55-kube-api-access-xx2jh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4cn7g\" (UID: \"668a0189-af32-4674-8c48-101dac5c1e55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288550 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288599 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnkff\" (UniqueName: \"kubernetes.io/projected/6a3c6b85-334f-431c-8840-ca3cf37451a9-kube-api-access-hnkff\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288721 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw5s\" (UniqueName: \"kubernetes.io/projected/fe433b1b-c379-493b-8e5a-74dff21a208d-kube-api-access-ssw5s\") pod \"test-operator-controller-manager-5c5cb9c4d7-8ggqw\" (UID: \"fe433b1b-c379-493b-8e5a-74dff21a208d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.288742 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.289144 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 
19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.289188 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:43.289173205 +0000 UTC m=+973.394203054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.306840 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.311630 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw5s\" (UniqueName: \"kubernetes.io/projected/fe433b1b-c379-493b-8e5a-74dff21a208d-kube-api-access-ssw5s\") pod \"test-operator-controller-manager-5c5cb9c4d7-8ggqw\" (UID: \"fe433b1b-c379-493b-8e5a-74dff21a208d\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.316759 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.321243 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwzx\" (UniqueName: \"kubernetes.io/projected/16cc55d4-44b6-4e27-9afd-484d4db42d1a-kube-api-access-hhwzx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-87zz5\" (UID: \"16cc55d4-44b6-4e27-9afd-484d4db42d1a\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.354759 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.390098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.390212 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.390263 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2jh\" (UniqueName: \"kubernetes.io/projected/668a0189-af32-4674-8c48-101dac5c1e55-kube-api-access-xx2jh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4cn7g\" (UID: 
\"668a0189-af32-4674-8c48-101dac5c1e55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.390310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnkff\" (UniqueName: \"kubernetes.io/projected/6a3c6b85-334f-431c-8840-ca3cf37451a9-kube-api-access-hnkff\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.390392 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.390517 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:42.890496086 +0000 UTC m=+972.995525935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.390644 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.390734 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. 
No retries permitted until 2026-03-19 19:12:42.890709402 +0000 UTC m=+972.995739251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.394555 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.398818 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.410574 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnkff\" (UniqueName: \"kubernetes.io/projected/6a3c6b85-334f-431c-8840-ca3cf37451a9-kube-api-access-hnkff\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.411617 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2jh\" (UniqueName: \"kubernetes.io/projected/668a0189-af32-4674-8c48-101dac5c1e55-kube-api-access-xx2jh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4cn7g\" (UID: \"668a0189-af32-4674-8c48-101dac5c1e55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.462980 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.492303 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.492669 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.492731 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:43.492711002 +0000 UTC m=+973.597740851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.500070 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.502392 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.561523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" event={"ID":"ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74","Type":"ContainerStarted","Data":"9d7a723ae4c440f3847c2612652c09fb4ef33f480059fab342f97cac4220c75b"} Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.652666 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.898842 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.899092 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.899231 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 
19:12:42.899277 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:43.899264171 +0000 UTC m=+974.004294020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.899316 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: E0319 19:12:42.899335 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:43.899328943 +0000 UTC m=+974.004358792 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.981333 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc"] Mar 19 19:12:42 crc kubenswrapper[5033]: I0319 19:12:42.991240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.105127 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84"] Mar 19 19:12:43 crc kubenswrapper[5033]: W0319 19:12:43.115289 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98807d9f_1747_4534_9727_57a6d81775b6.slice/crio-5db97b101a3b31497d2af7035d8a4e3556f9100e7ef4b542618060ed621c3503 WatchSource:0}: Error finding container 5db97b101a3b31497d2af7035d8a4e3556f9100e7ef4b542618060ed621c3503: Status 404 returned error can't find the container with id 5db97b101a3b31497d2af7035d8a4e3556f9100e7ef4b542618060ed621c3503 Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.119568 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.130245 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.294664 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.300350 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.306024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.306180 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.306226 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:45.306212131 +0000 UTC m=+975.411241980 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: W0319 19:12:43.318588 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd565f597_d2c9_45cf_84fd_81986897c7ec.slice/crio-474dc7c2fed538f60461119afc3cb26ed03de6b1a8f0a91c2507eafa74859957 WatchSource:0}: Error finding container 474dc7c2fed538f60461119afc3cb26ed03de6b1a8f0a91c2507eafa74859957: Status 404 returned error can't find the container with id 474dc7c2fed538f60461119afc3cb26ed03de6b1a8f0a91c2507eafa74859957 Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.324524 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.331246 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4nfws"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.338480 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.346647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.354266 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.364616 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2"] Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.365780 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qzv6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-dfthw_openstack-operators(b34dd812-d36d-4bfb-93df-03caacef3d64): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.366722 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l"] Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.367372 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" podUID="b34dd812-d36d-4bfb-93df-03caacef3d64" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.368768 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hqkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6c5c766d94-gvt8l_openstack-operators(2f3d3208-9746-409f-95e9-7ada3c61671d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.368823 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrhc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-h4rn2_openstack-operators(344bdd4e-7b37-4bd7-b403-58a5aa242946): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.370037 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" podUID="344bdd4e-7b37-4bd7-b403-58a5aa242946" Mar 19 19:12:43 crc 
kubenswrapper[5033]: E0319 19:12:43.370078 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" podUID="2f3d3208-9746-409f-95e9-7ada3c61671d" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.508347 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.508528 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.508621 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:45.508599176 +0000 UTC m=+975.613629085 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.569738 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" event={"ID":"d565f597-d2c9-45cf-84fd-81986897c7ec","Type":"ContainerStarted","Data":"474dc7c2fed538f60461119afc3cb26ed03de6b1a8f0a91c2507eafa74859957"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.572790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zczlg"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.574698 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" event={"ID":"344bdd4e-7b37-4bd7-b403-58a5aa242946","Type":"ContainerStarted","Data":"2261522dab8a6e4101b759c030c3277489397a87c02528f1dea66c7b327d4e09"} Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.575901 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" podUID="344bdd4e-7b37-4bd7-b403-58a5aa242946" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.576356 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" 
event={"ID":"1d6c9166-95cd-42d6-9782-56e8879c0412","Type":"ContainerStarted","Data":"60e5c5c5e55a07d0a6770e11b929d8b2a7ccd7179573e1d36600fc3ac4a95685"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.581770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" event={"ID":"58ad3c6c-eec2-41dd-98ac-0ff2454ba608","Type":"ContainerStarted","Data":"b7e9c972fd9f3ccdb2052788cf2db976e7812c91153b07a0d88095f07cf41301"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.589591 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" event={"ID":"b34dd812-d36d-4bfb-93df-03caacef3d64","Type":"ContainerStarted","Data":"9511dd0ce90e61220cdc1d9d20f1c736183084e5bfd67be4e50e92492df36b52"} Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.600877 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" podUID="b34dd812-d36d-4bfb-93df-03caacef3d64" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.602695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" event={"ID":"959b54c4-e249-46da-a57f-e997e6944147","Type":"ContainerStarted","Data":"8c2c6d9949681cf63c57c4625100edb40d724376e732f0a7e9db6c3ed2878f85"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.604623 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" 
event={"ID":"98807d9f-1747-4534-9727-57a6d81775b6","Type":"ContainerStarted","Data":"5db97b101a3b31497d2af7035d8a4e3556f9100e7ef4b542618060ed621c3503"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.616205 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" event={"ID":"80d312b1-fa3e-4746-baaf-1aa74b1a6a46","Type":"ContainerStarted","Data":"0002c9bcc21d6d3f2fd5cc1ebd53205263bb61595a254fae1ca8f284687cb3ee"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.617313 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g"] Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.617775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" event={"ID":"336483e6-616b-4489-bd29-5f7b62ef0d45","Type":"ContainerStarted","Data":"9e1caebb4df38411aff227fa7cc3e8daad899c2add869333975d3fed59b6e764"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.619604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" event={"ID":"f160f298-8509-46b1-865f-7241c0dd299f","Type":"ContainerStarted","Data":"d7a78ee996684f1d831b303a5819400405cd8fcf3f360235264363926a714d41"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.621989 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" event={"ID":"4e5bdc34-6776-4463-badb-8666194c5f89","Type":"ContainerStarted","Data":"c9f5ec23fa2c4da1674c5b8b7030c14ca3218474d4be694ca4e84a355caa8a49"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.625124 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" 
event={"ID":"e7453f1e-da50-4386-ac0d-64d309e192b0","Type":"ContainerStarted","Data":"d4c8563531200efbf6976e27eeae8bb247a6294372ba7c4afa489d9d8d55be0e"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.628949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" event={"ID":"22389b01-1060-4ddb-8fc3-3d3faeafefd2","Type":"ContainerStarted","Data":"b64182b6f619a26b14708887ed07a76fbb8253cd4c8dc4a06e3639898d325782"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.630401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" event={"ID":"c8a93f31-4443-4247-b4de-d0bc1e26c6f8","Type":"ContainerStarted","Data":"24ebbf3b23a7628985436a19390446022b6877f880bc48aebea542a2836b0ad1"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.631585 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" event={"ID":"2f3d3208-9746-409f-95e9-7ada3c61671d","Type":"ContainerStarted","Data":"4fe52435e09254bf3727caf1515a0d8ac030514cc0e97e9468a50547964c35d3"} Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.633133 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw"] Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.634655 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" podUID="2f3d3208-9746-409f-95e9-7ada3c61671d" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.650040 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5"] Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.652805 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ssw5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-8ggqw_openstack-operators(fe433b1b-c379-493b-8e5a-74dff21a208d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.654253 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" podUID="fe433b1b-c379-493b-8e5a-74dff21a208d" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.665726 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hhwzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-87zz5_openstack-operators(16cc55d4-44b6-4e27-9afd-484d4db42d1a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.667959 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" podUID="16cc55d4-44b6-4e27-9afd-484d4db42d1a" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.925646 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:43 crc kubenswrapper[5033]: I0319 19:12:43.925703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.925872 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.925924 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:45.925909928 +0000 UTC m=+976.030939777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.925998 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:43 crc kubenswrapper[5033]: E0319 19:12:43.926018 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:45.926012021 +0000 UTC m=+976.031041870 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:44 crc kubenswrapper[5033]: I0319 19:12:44.640521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" event={"ID":"fe433b1b-c379-493b-8e5a-74dff21a208d","Type":"ContainerStarted","Data":"0356b576e95c8da5a49ee7a56cffc4dcfaafdfde581f09e0d9fe674eb78f5277"} Mar 19 19:12:44 crc kubenswrapper[5033]: E0319 19:12:44.642422 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" podUID="fe433b1b-c379-493b-8e5a-74dff21a208d" Mar 19 19:12:44 crc kubenswrapper[5033]: I0319 19:12:44.643620 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" event={"ID":"16cc55d4-44b6-4e27-9afd-484d4db42d1a","Type":"ContainerStarted","Data":"3dbb824e4498ac0b15803a9e8ab96fb5d54b19c075ec544a1869c05f1af48b61"} Mar 19 19:12:44 crc kubenswrapper[5033]: E0319 19:12:44.646025 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" podUID="16cc55d4-44b6-4e27-9afd-484d4db42d1a" Mar 19 19:12:44 crc kubenswrapper[5033]: I0319 19:12:44.647580 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" event={"ID":"668a0189-af32-4674-8c48-101dac5c1e55","Type":"ContainerStarted","Data":"aeb28714550f0143661e7ace748651b11d027fdf599182f9191977058e9e43b5"} Mar 19 19:12:44 crc kubenswrapper[5033]: I0319 19:12:44.649768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" event={"ID":"ae6d87d9-9ab4-4a9c-84f4-e5913246875e","Type":"ContainerStarted","Data":"55385de36cf5642efe203db2cc1369e0e61f39af5d5e4d8e8524adeb7949c347"} Mar 19 19:12:44 crc kubenswrapper[5033]: E0319 19:12:44.650991 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" podUID="2f3d3208-9746-409f-95e9-7ada3c61671d" Mar 19 19:12:44 crc kubenswrapper[5033]: E0319 19:12:44.651060 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" podUID="b34dd812-d36d-4bfb-93df-03caacef3d64" Mar 19 19:12:44 crc kubenswrapper[5033]: E0319 19:12:44.652005 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" 
podUID="344bdd4e-7b37-4bd7-b403-58a5aa242946" Mar 19 19:12:45 crc kubenswrapper[5033]: I0319 19:12:45.354644 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.354761 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.354835 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:49.354819602 +0000 UTC m=+979.459849441 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: I0319 19:12:45.557542 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.557744 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.557830 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:49.557812073 +0000 UTC m=+979.662841922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.658037 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" podUID="16cc55d4-44b6-4e27-9afd-484d4db42d1a" Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.658054 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" podUID="fe433b1b-c379-493b-8e5a-74dff21a208d" Mar 19 19:12:45 crc kubenswrapper[5033]: I0319 19:12:45.962778 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:45 crc kubenswrapper[5033]: I0319 19:12:45.962858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod 
\"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.963270 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.963289 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.963363 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:49.963345033 +0000 UTC m=+980.068374882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:45 crc kubenswrapper[5033]: E0319 19:12:45.964161 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:49.963373114 +0000 UTC m=+980.068402963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:49 crc kubenswrapper[5033]: I0319 19:12:49.418619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:49 crc kubenswrapper[5033]: E0319 19:12:49.418860 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:49 crc kubenswrapper[5033]: E0319 19:12:49.419083 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:57.419055604 +0000 UTC m=+987.524085453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:49 crc kubenswrapper[5033]: I0319 19:12:49.620811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:49 crc kubenswrapper[5033]: E0319 19:12:49.620960 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:49 crc kubenswrapper[5033]: E0319 19:12:49.621209 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:57.621195952 +0000 UTC m=+987.726225801 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:50 crc kubenswrapper[5033]: I0319 19:12:50.026620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:50 crc kubenswrapper[5033]: I0319 19:12:50.026756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:50 crc kubenswrapper[5033]: E0319 19:12:50.026815 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:50 crc kubenswrapper[5033]: E0319 19:12:50.026882 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:50 crc kubenswrapper[5033]: E0319 19:12:50.026886 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:58.026869356 +0000 UTC m=+988.131899205 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:50 crc kubenswrapper[5033]: E0319 19:12:50.026937 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:12:58.026923488 +0000 UTC m=+988.131953337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.895421 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.897253 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.932060 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.960622 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2l6g\" (UniqueName: \"kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.960664 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:51 crc kubenswrapper[5033]: I0319 19:12:51.960785 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.062406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.062852 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m2l6g\" (UniqueName: \"kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.062882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.063074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.063371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.097575 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2l6g\" (UniqueName: \"kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g\") pod \"certified-operators-nhjvw\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:52 crc kubenswrapper[5033]: I0319 19:12:52.222892 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:12:56 crc kubenswrapper[5033]: E0319 19:12:56.802633 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 19 19:12:56 crc kubenswrapper[5033]: E0319 19:12:56.803104 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xx2jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4cn7g_openstack-operators(668a0189-af32-4674-8c48-101dac5c1e55): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:12:56 crc kubenswrapper[5033]: E0319 19:12:56.804360 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" podUID="668a0189-af32-4674-8c48-101dac5c1e55" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.325615 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:12:57 crc kubenswrapper[5033]: W0319 19:12:57.347147 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fc919d_6163_4e93_adec_22070d22c7df.slice/crio-30349395d519a58c5dc1b84409766ffd102d6b0a94284bedb1b09188142ef756 WatchSource:0}: Error 
finding container 30349395d519a58c5dc1b84409766ffd102d6b0a94284bedb1b09188142ef756: Status 404 returned error can't find the container with id 30349395d519a58c5dc1b84409766ffd102d6b0a94284bedb1b09188142ef756 Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.455812 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:12:57 crc kubenswrapper[5033]: E0319 19:12:57.455990 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:57 crc kubenswrapper[5033]: E0319 19:12:57.456041 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert podName:b5102fa8-cc34-4f68-a4af-f26a243c3238 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:13.456026216 +0000 UTC m=+1003.561056065 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert") pod "infra-operator-controller-manager-7b9c774f96-vfg9k" (UID: "b5102fa8-cc34-4f68-a4af-f26a243c3238") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.658244 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:12:57 crc kubenswrapper[5033]: E0319 19:12:57.658367 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:57 crc kubenswrapper[5033]: E0319 19:12:57.658416 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert podName:716e7850-0e5e-4cd7-8de4-1b3b6bd51a16 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:13.6584024 +0000 UTC m=+1003.763432249 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-58kzh" (UID: "716e7850-0e5e-4cd7-8de4-1b3b6bd51a16") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.765306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" event={"ID":"c8a93f31-4443-4247-b4de-d0bc1e26c6f8","Type":"ContainerStarted","Data":"58ed9af16a76d456336f3f2a5263b84fbd08e68fa548cd0cb97ca7a3a64871a5"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.765434 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.767087 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" event={"ID":"e7453f1e-da50-4386-ac0d-64d309e192b0","Type":"ContainerStarted","Data":"7e4db93fae4838276aa82da52874d7b39dcd3eacfa18027d28d1d94d3c022994"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.767190 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.771877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" event={"ID":"336483e6-616b-4489-bd29-5f7b62ef0d45","Type":"ContainerStarted","Data":"e07ede3feca65e5f47519576ebc1fffe04fe935419460a4c1e3e9083a3aa348e"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.772054 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 
19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.781601 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" event={"ID":"f160f298-8509-46b1-865f-7241c0dd299f","Type":"ContainerStarted","Data":"74260e034e0f60ba7218b92360daaa8c47eae4fb34a82b77ad272ce39786f7a7"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.781686 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.783181 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" event={"ID":"22389b01-1060-4ddb-8fc3-3d3faeafefd2","Type":"ContainerStarted","Data":"e2407dc12b6996c015c0cb124cfdb13927bff557612444d2cf8ba48b313d8915"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.783265 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.787930 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" event={"ID":"58ad3c6c-eec2-41dd-98ac-0ff2454ba608","Type":"ContainerStarted","Data":"cfb84ccef87fee3251155d51acafe8ce79ac9cd8f25edfa34efb18297981e14b"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.788283 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.790496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" 
event={"ID":"ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74","Type":"ContainerStarted","Data":"f69283bce225c10b779d61a36b457280c12893ed027117b795d9d9300133735c"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.790691 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.796676 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" event={"ID":"1d6c9166-95cd-42d6-9782-56e8879c0412","Type":"ContainerStarted","Data":"1832f43ce207463e79335c90a7e114241b49180877dba59e6a37a4c6340823dc"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.796953 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.811860 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" event={"ID":"959b54c4-e249-46da-a57f-e997e6944147","Type":"ContainerStarted","Data":"da23b41613b5236af15c48e6d1a96920e25fa40d818d873302133c41419e16de"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.813021 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.818330 5033 generic.go:334] "Generic (PLEG): container finished" podID="01fc919d-6163-4e93-adec-22070d22c7df" containerID="6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1" exitCode=0 Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.818457 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" 
event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerDied","Data":"6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.818481 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerStarted","Data":"30349395d519a58c5dc1b84409766ffd102d6b0a94284bedb1b09188142ef756"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.822315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" event={"ID":"4e5bdc34-6776-4463-badb-8666194c5f89","Type":"ContainerStarted","Data":"b757864341cd82c49ca004e7795a65a2684e1f349ff1e2ecae01fc937eafd274"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.822538 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.843275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" event={"ID":"d565f597-d2c9-45cf-84fd-81986897c7ec","Type":"ContainerStarted","Data":"867c3958d141b9ec70496593b4550967843baa11c79b5ee1c2f4c3a9ce25335a"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.843791 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.860171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" event={"ID":"ae6d87d9-9ab4-4a9c-84f4-e5913246875e","Type":"ContainerStarted","Data":"84bd7ae10e3ef0d16457530027c8cdeb60f14aee685d83bd80f0f4e19ec8ea7e"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.860924 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.878775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" event={"ID":"98807d9f-1747-4534-9727-57a6d81775b6","Type":"ContainerStarted","Data":"a3542c9c6297119dbe0e2ddb336391120d3c42729bcf797bf970fff84eede6f0"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.879316 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.883308 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" event={"ID":"80d312b1-fa3e-4746-baaf-1aa74b1a6a46","Type":"ContainerStarted","Data":"a10b351cd2965705a33e2ec85280164b32554b0d865a62cae3d295c872682dc1"} Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.883338 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:12:57 crc kubenswrapper[5033]: E0319 19:12:57.883884 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" podUID="668a0189-af32-4674-8c48-101dac5c1e55" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.893387 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" podStartSLOduration=3.437005108 
podStartE2EDuration="16.893357781s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.322833499 +0000 UTC m=+973.427863348" lastFinishedPulling="2026-03-19 19:12:56.779186162 +0000 UTC m=+986.884216021" observedRunningTime="2026-03-19 19:12:57.892637791 +0000 UTC m=+987.997667640" watchObservedRunningTime="2026-03-19 19:12:57.893357781 +0000 UTC m=+987.998387630" Mar 19 19:12:57 crc kubenswrapper[5033]: I0319 19:12:57.985045 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" podStartSLOduration=3.765968624 podStartE2EDuration="16.98502421s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.602293452 +0000 UTC m=+973.707323301" lastFinishedPulling="2026-03-19 19:12:56.821349038 +0000 UTC m=+986.926378887" observedRunningTime="2026-03-19 19:12:57.982462748 +0000 UTC m=+988.087492597" watchObservedRunningTime="2026-03-19 19:12:57.98502421 +0000 UTC m=+988.090054059" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.067565 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.067733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:12:58 crc kubenswrapper[5033]: 
E0319 19:12:58.067740 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:12:58 crc kubenswrapper[5033]: E0319 19:12:58.067822 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:14.067803089 +0000 UTC m=+1004.172832988 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "webhook-server-cert" not found Mar 19 19:12:58 crc kubenswrapper[5033]: E0319 19:12:58.067854 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:12:58 crc kubenswrapper[5033]: E0319 19:12:58.067899 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs podName:6a3c6b85-334f-431c-8840-ca3cf37451a9 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:14.067886452 +0000 UTC m=+1004.172916301 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-j4xnq" (UID: "6a3c6b85-334f-431c-8840-ca3cf37451a9") : secret "metrics-server-cert" not found Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.272928 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" podStartSLOduration=3.758568276 podStartE2EDuration="17.27290794s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.337359118 +0000 UTC m=+973.442388967" lastFinishedPulling="2026-03-19 19:12:56.851698782 +0000 UTC m=+986.956728631" observedRunningTime="2026-03-19 19:12:58.098506863 +0000 UTC m=+988.203536712" watchObservedRunningTime="2026-03-19 19:12:58.27290794 +0000 UTC m=+988.377937949" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.274241 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" podStartSLOduration=3.841659784 podStartE2EDuration="17.274232307s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.346919587 +0000 UTC m=+973.451949436" lastFinishedPulling="2026-03-19 19:12:56.77949211 +0000 UTC m=+986.884521959" observedRunningTime="2026-03-19 19:12:58.270090621 +0000 UTC m=+988.375120470" watchObservedRunningTime="2026-03-19 19:12:58.274232307 +0000 UTC m=+988.379262156" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.307498 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" podStartSLOduration=3.530497509 podStartE2EDuration="17.307478693s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.001849388 +0000 UTC 
m=+973.106879227" lastFinishedPulling="2026-03-19 19:12:56.778830552 +0000 UTC m=+986.883860411" observedRunningTime="2026-03-19 19:12:58.305912089 +0000 UTC m=+988.410941938" watchObservedRunningTime="2026-03-19 19:12:58.307478693 +0000 UTC m=+988.412508542" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.365819 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" podStartSLOduration=3.118493626 podStartE2EDuration="17.365802354s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:42.579108053 +0000 UTC m=+972.684137902" lastFinishedPulling="2026-03-19 19:12:56.826416781 +0000 UTC m=+986.931446630" observedRunningTime="2026-03-19 19:12:58.364502347 +0000 UTC m=+988.469532186" watchObservedRunningTime="2026-03-19 19:12:58.365802354 +0000 UTC m=+988.470832203" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.425076 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" podStartSLOduration=3.747394651 podStartE2EDuration="17.425056521s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.130020044 +0000 UTC m=+973.235049893" lastFinishedPulling="2026-03-19 19:12:56.807681914 +0000 UTC m=+986.912711763" observedRunningTime="2026-03-19 19:12:58.422570921 +0000 UTC m=+988.527600770" watchObservedRunningTime="2026-03-19 19:12:58.425056521 +0000 UTC m=+988.530086370" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.467069 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" podStartSLOduration=3.980321685 podStartE2EDuration="17.467050283s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.336279247 +0000 UTC m=+973.441309096" 
lastFinishedPulling="2026-03-19 19:12:56.823007845 +0000 UTC m=+986.928037694" observedRunningTime="2026-03-19 19:12:58.460122098 +0000 UTC m=+988.565151947" watchObservedRunningTime="2026-03-19 19:12:58.467050283 +0000 UTC m=+988.572080132" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.503227 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" podStartSLOduration=4.105160928 podStartE2EDuration="17.50321008s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.00049641 +0000 UTC m=+973.105526259" lastFinishedPulling="2026-03-19 19:12:56.398545552 +0000 UTC m=+986.503575411" observedRunningTime="2026-03-19 19:12:58.49930969 +0000 UTC m=+988.604339539" watchObservedRunningTime="2026-03-19 19:12:58.50321008 +0000 UTC m=+988.608239929" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.553865 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" podStartSLOduration=3.228987836 podStartE2EDuration="17.553833605s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:42.453983833 +0000 UTC m=+972.559013682" lastFinishedPulling="2026-03-19 19:12:56.778829592 +0000 UTC m=+986.883859451" observedRunningTime="2026-03-19 19:12:58.548195256 +0000 UTC m=+988.653225105" watchObservedRunningTime="2026-03-19 19:12:58.553833605 +0000 UTC m=+988.658863454" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.588401 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" podStartSLOduration=4.106382513 podStartE2EDuration="17.588381397s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.337355378 +0000 UTC m=+973.442385227" lastFinishedPulling="2026-03-19 
19:12:56.819354262 +0000 UTC m=+986.924384111" observedRunningTime="2026-03-19 19:12:58.585221248 +0000 UTC m=+988.690251097" watchObservedRunningTime="2026-03-19 19:12:58.588381397 +0000 UTC m=+988.693411246" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.628597 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" podStartSLOduration=4.1499738090000005 podStartE2EDuration="17.628580768s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.337039519 +0000 UTC m=+973.442069368" lastFinishedPulling="2026-03-19 19:12:56.815646478 +0000 UTC m=+986.920676327" observedRunningTime="2026-03-19 19:12:58.627135787 +0000 UTC m=+988.732165626" watchObservedRunningTime="2026-03-19 19:12:58.628580768 +0000 UTC m=+988.733610617" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.654907 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" podStartSLOduration=4.004047053 podStartE2EDuration="17.654889378s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.128631995 +0000 UTC m=+973.233661844" lastFinishedPulling="2026-03-19 19:12:56.77947431 +0000 UTC m=+986.884504169" observedRunningTime="2026-03-19 19:12:58.649476836 +0000 UTC m=+988.754506685" watchObservedRunningTime="2026-03-19 19:12:58.654889378 +0000 UTC m=+988.759919227" Mar 19 19:12:58 crc kubenswrapper[5033]: I0319 19:12:58.683711 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" podStartSLOduration=3.984375189 podStartE2EDuration="17.683694258s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.116620917 +0000 UTC m=+973.221650756" lastFinishedPulling="2026-03-19 19:12:56.815939976 +0000 
UTC m=+986.920969825" observedRunningTime="2026-03-19 19:12:58.682267748 +0000 UTC m=+988.787297597" watchObservedRunningTime="2026-03-19 19:12:58.683694258 +0000 UTC m=+988.788724107" Mar 19 19:12:59 crc kubenswrapper[5033]: I0319 19:12:59.900866 5033 generic.go:334] "Generic (PLEG): container finished" podID="01fc919d-6163-4e93-adec-22070d22c7df" containerID="64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6" exitCode=0 Mar 19 19:12:59 crc kubenswrapper[5033]: I0319 19:12:59.900942 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerDied","Data":"64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6"} Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.052265 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-hn4dm" Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.070118 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-n8ssn" Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.191648 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dgp4k" Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.308883 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-w2pfb" Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.319944 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-n9lpt" Mar 19 19:13:02 crc kubenswrapper[5033]: I0319 19:13:02.358038 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-zczlg" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.972970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" event={"ID":"344bdd4e-7b37-4bd7-b403-58a5aa242946","Type":"ContainerStarted","Data":"cd509367c5634c7c2d5cccd1758b69bc97f4a697b8fce52b7ec1fd40bc8a8dce"} Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.975482 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.977136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" event={"ID":"fe433b1b-c379-493b-8e5a-74dff21a208d","Type":"ContainerStarted","Data":"ab30fd900c23ca5dfed970024ec005aa91c834a498a366494b3f9feb9e929d0a"} Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.977335 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.980080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" event={"ID":"16cc55d4-44b6-4e27-9afd-484d4db42d1a","Type":"ContainerStarted","Data":"1136059af6cd04d0eae244043df66fc05269bd6812d9f71b0e413701f4d8a008"} Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.980322 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.986921 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" 
event={"ID":"b34dd812-d36d-4bfb-93df-03caacef3d64","Type":"ContainerStarted","Data":"786f74aae0909918b97f4f5a3ff0412f0524eb155b4c71d957ef4adb4349f455"} Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.987130 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.997178 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" podStartSLOduration=3.247559058 podStartE2EDuration="27.997161372s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.36870949 +0000 UTC m=+973.473739339" lastFinishedPulling="2026-03-19 19:13:08.118311804 +0000 UTC m=+998.223341653" observedRunningTime="2026-03-19 19:13:08.995049082 +0000 UTC m=+999.100078941" watchObservedRunningTime="2026-03-19 19:13:08.997161372 +0000 UTC m=+999.102191231" Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.998280 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" event={"ID":"2f3d3208-9746-409f-95e9-7ada3c61671d","Type":"ContainerStarted","Data":"98972199dcac419053d29c0a57f770cb978e4ac401a2ab1229f38003999be052"} Mar 19 19:13:08 crc kubenswrapper[5033]: I0319 19:13:08.998911 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.007257 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerStarted","Data":"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb"} Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.023439 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" podStartSLOduration=3.528258546 podStartE2EDuration="28.023420471s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.652657589 +0000 UTC m=+973.757687438" lastFinishedPulling="2026-03-19 19:13:08.147819514 +0000 UTC m=+998.252849363" observedRunningTime="2026-03-19 19:13:09.015973351 +0000 UTC m=+999.121003190" watchObservedRunningTime="2026-03-19 19:13:09.023420471 +0000 UTC m=+999.128450320" Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.040092 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" podStartSLOduration=3.557795817 podStartE2EDuration="28.040075099s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.665572083 +0000 UTC m=+973.770601932" lastFinishedPulling="2026-03-19 19:13:08.147851365 +0000 UTC m=+998.252881214" observedRunningTime="2026-03-19 19:13:09.035995925 +0000 UTC m=+999.141025774" watchObservedRunningTime="2026-03-19 19:13:09.040075099 +0000 UTC m=+999.145104948" Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.052689 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" podStartSLOduration=3.257076176 podStartE2EDuration="28.052672174s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.365623563 +0000 UTC m=+973.470653412" lastFinishedPulling="2026-03-19 19:13:08.161219561 +0000 UTC m=+998.266249410" observedRunningTime="2026-03-19 19:13:09.048519067 +0000 UTC m=+999.153548916" watchObservedRunningTime="2026-03-19 19:13:09.052672174 +0000 UTC m=+999.157702023" Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.069066 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nhjvw" podStartSLOduration=7.737091931 podStartE2EDuration="18.069049735s" podCreationTimestamp="2026-03-19 19:12:51 +0000 UTC" firstStartedPulling="2026-03-19 19:12:57.821832768 +0000 UTC m=+987.926862617" lastFinishedPulling="2026-03-19 19:13:08.153790572 +0000 UTC m=+998.258820421" observedRunningTime="2026-03-19 19:13:09.068834058 +0000 UTC m=+999.173863907" watchObservedRunningTime="2026-03-19 19:13:09.069049735 +0000 UTC m=+999.174079584" Mar 19 19:13:09 crc kubenswrapper[5033]: I0319 19:13:09.087579 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" podStartSLOduration=3.307447142 podStartE2EDuration="28.087560195s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.368514964 +0000 UTC m=+973.473544813" lastFinishedPulling="2026-03-19 19:13:08.148628017 +0000 UTC m=+998.253657866" observedRunningTime="2026-03-19 19:13:09.081708771 +0000 UTC m=+999.186738620" watchObservedRunningTime="2026-03-19 19:13:09.087560195 +0000 UTC m=+999.192590044" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.021287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" event={"ID":"668a0189-af32-4674-8c48-101dac5c1e55","Type":"ContainerStarted","Data":"711542bb8cbed227cd27147bf7504c4bffba15fa1a4ac8bb67efccfd007dac25"} Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.038779 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4cn7g" podStartSLOduration=2.592880874 podStartE2EDuration="29.038616071s" podCreationTimestamp="2026-03-19 19:12:42 +0000 UTC" firstStartedPulling="2026-03-19 19:12:43.644611653 +0000 UTC m=+973.749641502" 
lastFinishedPulling="2026-03-19 19:13:10.09034685 +0000 UTC m=+1000.195376699" observedRunningTime="2026-03-19 19:13:11.03538249 +0000 UTC m=+1001.140412339" watchObservedRunningTime="2026-03-19 19:13:11.038616071 +0000 UTC m=+1001.143645920" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.603402 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-zjvmm" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.623103 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-t7sd4" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.673668 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5rrzc" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.810146 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-xtxhs" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.880280 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-lf4wr" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.902897 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-j8lbc" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.972842 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4nfws" Mar 19 19:13:11 crc kubenswrapper[5033]: I0319 19:13:11.976351 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-r8r84" Mar 19 19:13:12 crc 
kubenswrapper[5033]: I0319 19:13:12.223774 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:12 crc kubenswrapper[5033]: I0319 19:13:12.223857 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:12 crc kubenswrapper[5033]: I0319 19:13:12.266154 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.075093 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.127542 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.523233 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.528671 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5102fa8-cc34-4f68-a4af-f26a243c3238-cert\") pod \"infra-operator-controller-manager-7b9c774f96-vfg9k\" (UID: \"b5102fa8-cc34-4f68-a4af-f26a243c3238\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.624776 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.728257 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.735564 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/716e7850-0e5e-4cd7-8de4-1b3b6bd51a16-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-58kzh\" (UID: \"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:13:13 crc kubenswrapper[5033]: I0319 19:13:13.935619 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.025488 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k"] Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.041302 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" event={"ID":"b5102fa8-cc34-4f68-a4af-f26a243c3238","Type":"ContainerStarted","Data":"8c8960db65c5e1d4d58893f5b41b0854031456960db6a75717802e48796a3b85"} Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.132292 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.132513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.135851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 
19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.136010 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a3c6b85-334f-431c-8840-ca3cf37451a9-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-j4xnq\" (UID: \"6a3c6b85-334f-431c-8840-ca3cf37451a9\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.347872 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.361742 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh"] Mar 19 19:13:14 crc kubenswrapper[5033]: I0319 19:13:14.773982 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq"] Mar 19 19:13:14 crc kubenswrapper[5033]: W0319 19:13:14.782786 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3c6b85_334f_431c_8840_ca3cf37451a9.slice/crio-7a5af61d566d6205edbd454aa71faf08e9e210f71f261f400dd22c9be3cd93fe WatchSource:0}: Error finding container 7a5af61d566d6205edbd454aa71faf08e9e210f71f261f400dd22c9be3cd93fe: Status 404 returned error can't find the container with id 7a5af61d566d6205edbd454aa71faf08e9e210f71f261f400dd22c9be3cd93fe Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.048190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" event={"ID":"6a3c6b85-334f-431c-8840-ca3cf37451a9","Type":"ContainerStarted","Data":"fa5467597bf0c56248627a7266e301a45c0fe86fdceafb4181b41d47b360a8ad"} Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 
19:13:15.048284 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" event={"ID":"6a3c6b85-334f-431c-8840-ca3cf37451a9","Type":"ContainerStarted","Data":"7a5af61d566d6205edbd454aa71faf08e9e210f71f261f400dd22c9be3cd93fe"} Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.048301 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.049237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" event={"ID":"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16","Type":"ContainerStarted","Data":"cbc96148de23cc8a158e220c9f7d7bb0d0dfeefef0ffacd60402ab91aec787de"} Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.049333 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nhjvw" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="registry-server" containerID="cri-o://1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb" gracePeriod=2 Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.076534 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" podStartSLOduration=33.076516013 podStartE2EDuration="33.076516013s" podCreationTimestamp="2026-03-19 19:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:13:15.071281455 +0000 UTC m=+1005.176311304" watchObservedRunningTime="2026-03-19 19:13:15.076516013 +0000 UTC m=+1005.181545862" Mar 19 19:13:15 crc kubenswrapper[5033]: I0319 19:13:15.881179 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.058069 5033 generic.go:334] "Generic (PLEG): container finished" podID="01fc919d-6163-4e93-adec-22070d22c7df" containerID="1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb" exitCode=0 Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.058155 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nhjvw" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.058169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerDied","Data":"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb"} Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.058226 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhjvw" event={"ID":"01fc919d-6163-4e93-adec-22070d22c7df","Type":"ContainerDied","Data":"30349395d519a58c5dc1b84409766ffd102d6b0a94284bedb1b09188142ef756"} Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.058251 5033 scope.go:117] "RemoveContainer" containerID="1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.060683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2l6g\" (UniqueName: \"kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g\") pod \"01fc919d-6163-4e93-adec-22070d22c7df\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.060746 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities\") pod 
\"01fc919d-6163-4e93-adec-22070d22c7df\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.060831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content\") pod \"01fc919d-6163-4e93-adec-22070d22c7df\" (UID: \"01fc919d-6163-4e93-adec-22070d22c7df\") " Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.061822 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities" (OuterVolumeSpecName: "utilities") pod "01fc919d-6163-4e93-adec-22070d22c7df" (UID: "01fc919d-6163-4e93-adec-22070d22c7df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.078616 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g" (OuterVolumeSpecName: "kube-api-access-m2l6g") pod "01fc919d-6163-4e93-adec-22070d22c7df" (UID: "01fc919d-6163-4e93-adec-22070d22c7df"). InnerVolumeSpecName "kube-api-access-m2l6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.118314 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01fc919d-6163-4e93-adec-22070d22c7df" (UID: "01fc919d-6163-4e93-adec-22070d22c7df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.163673 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.163707 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01fc919d-6163-4e93-adec-22070d22c7df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.163720 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2l6g\" (UniqueName: \"kubernetes.io/projected/01fc919d-6163-4e93-adec-22070d22c7df-kube-api-access-m2l6g\") on node \"crc\" DevicePath \"\"" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.392936 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.425835 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nhjvw"] Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.593360 5033 scope.go:117] "RemoveContainer" containerID="64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.639586 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fc919d-6163-4e93-adec-22070d22c7df" path="/var/lib/kubelet/pods/01fc919d-6163-4e93-adec-22070d22c7df/volumes" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.647988 5033 scope.go:117] "RemoveContainer" containerID="6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.675110 5033 scope.go:117] "RemoveContainer" containerID="1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb" 
Mar 19 19:13:16 crc kubenswrapper[5033]: E0319 19:13:16.675597 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb\": container with ID starting with 1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb not found: ID does not exist" containerID="1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.675639 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb"} err="failed to get container status \"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb\": rpc error: code = NotFound desc = could not find container \"1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb\": container with ID starting with 1bac940bd644cdead02916415375be7938aa5d013c03fef8f8c26b417d4cbecb not found: ID does not exist" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.675665 5033 scope.go:117] "RemoveContainer" containerID="64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6" Mar 19 19:13:16 crc kubenswrapper[5033]: E0319 19:13:16.676080 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6\": container with ID starting with 64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6 not found: ID does not exist" containerID="64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.676109 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6"} err="failed to get container status 
\"64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6\": rpc error: code = NotFound desc = could not find container \"64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6\": container with ID starting with 64c590d09a1fe6bc53777b039cc1d6b9178d6a2b4eb68968f42a4ad4f4a5eba6 not found: ID does not exist" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.676133 5033 scope.go:117] "RemoveContainer" containerID="6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1" Mar 19 19:13:16 crc kubenswrapper[5033]: E0319 19:13:16.676329 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1\": container with ID starting with 6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1 not found: ID does not exist" containerID="6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1" Mar 19 19:13:16 crc kubenswrapper[5033]: I0319 19:13:16.676346 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1"} err="failed to get container status \"6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1\": rpc error: code = NotFound desc = could not find container \"6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1\": container with ID starting with 6d6cd11a1713102a1fa807978a0c17ba057ba24755ce72f9bd74cc82323d06e1 not found: ID does not exist" Mar 19 19:13:17 crc kubenswrapper[5033]: I0319 19:13:17.066744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" event={"ID":"716e7850-0e5e-4cd7-8de4-1b3b6bd51a16","Type":"ContainerStarted","Data":"52c648c28e8bcaf86c3fe0a61ae1854ab4db3cd98c6498ef35efd6174527daae"} Mar 19 19:13:17 crc kubenswrapper[5033]: I0319 19:13:17.066835 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:13:17 crc kubenswrapper[5033]: I0319 19:13:17.069266 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" event={"ID":"b5102fa8-cc34-4f68-a4af-f26a243c3238","Type":"ContainerStarted","Data":"f9b9851795f7f937e7d61e7df55b05e174c89a26af3963e03f0146f09f8cb8ec"} Mar 19 19:13:17 crc kubenswrapper[5033]: I0319 19:13:17.069401 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:13:17 crc kubenswrapper[5033]: I0319 19:13:17.089846 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" podStartSLOduration=33.81379844 podStartE2EDuration="36.08982659s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:13:14.373314067 +0000 UTC m=+1004.478343916" lastFinishedPulling="2026-03-19 19:13:16.649342227 +0000 UTC m=+1006.754372066" observedRunningTime="2026-03-19 19:13:17.089328756 +0000 UTC m=+1007.194358605" watchObservedRunningTime="2026-03-19 19:13:17.08982659 +0000 UTC m=+1007.194856439" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.101518 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-h4rn2" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.130530 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" podStartSLOduration=39.292634006 podStartE2EDuration="41.130505836s" podCreationTimestamp="2026-03-19 19:12:41 +0000 UTC" firstStartedPulling="2026-03-19 19:13:14.027134958 +0000 UTC 
m=+1004.132164807" lastFinishedPulling="2026-03-19 19:13:15.865006788 +0000 UTC m=+1005.970036637" observedRunningTime="2026-03-19 19:13:17.114942517 +0000 UTC m=+1007.219972366" watchObservedRunningTime="2026-03-19 19:13:22.130505836 +0000 UTC m=+1012.235535705" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.190353 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dfthw" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.403779 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-gvt8l" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.466636 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8ggqw" Mar 19 19:13:22 crc kubenswrapper[5033]: I0319 19:13:22.503620 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-87zz5" Mar 19 19:13:23 crc kubenswrapper[5033]: I0319 19:13:23.630008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-vfg9k" Mar 19 19:13:23 crc kubenswrapper[5033]: I0319 19:13:23.944878 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-58kzh" Mar 19 19:13:24 crc kubenswrapper[5033]: I0319 19:13:24.356720 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-j4xnq" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.907342 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:13:41 crc kubenswrapper[5033]: E0319 
19:13:41.912065 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="registry-server" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.912329 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="registry-server" Mar 19 19:13:41 crc kubenswrapper[5033]: E0319 19:13:41.912429 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="extract-utilities" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.912541 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="extract-utilities" Mar 19 19:13:41 crc kubenswrapper[5033]: E0319 19:13:41.912632 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="extract-content" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.912716 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="extract-content" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.913039 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fc919d-6163-4e93-adec-22070d22c7df" containerName="registry-server" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.914212 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.923079 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-s5d8k" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.923285 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.923414 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.923558 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.928638 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.994990 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:13:41 crc kubenswrapper[5033]: I0319 19:13:41.999151 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.001106 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.035893 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.095564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.095854 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2w4p\" (UniqueName: \"kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.095979 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.096103 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxb7\" (UniqueName: \"kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.096217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.197402 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.197733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2w4p\" (UniqueName: \"kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.197838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.197926 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxb7\" (UniqueName: \"kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 
crc kubenswrapper[5033]: I0319 19:13:42.198003 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.198439 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.198539 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.198902 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.216121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2w4p\" (UniqueName: \"kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p\") pod \"dnsmasq-dns-675f4bcbfc-6f7xz\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.218652 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8xxb7\" (UniqueName: \"kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7\") pod \"dnsmasq-dns-78dd6ddcc-lxj6v\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.248300 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.383669 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.674359 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:13:42 crc kubenswrapper[5033]: I0319 19:13:42.797858 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:13:42 crc kubenswrapper[5033]: W0319 19:13:42.800821 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda443183b_201b_4c13_984d_ce72e248a176.slice/crio-54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46 WatchSource:0}: Error finding container 54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46: Status 404 returned error can't find the container with id 54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46 Mar 19 19:13:43 crc kubenswrapper[5033]: I0319 19:13:43.274194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" event={"ID":"a443183b-201b-4c13-984d-ce72e248a176","Type":"ContainerStarted","Data":"54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46"} Mar 19 19:13:43 crc kubenswrapper[5033]: I0319 19:13:43.276022 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" 
event={"ID":"790aa7fd-a126-47a9-8cab-14b3242f7c59","Type":"ContainerStarted","Data":"0fe3fc878bbb05a139ea7b51a09057108b3f39225d3d17ab9e603a7c14407983"} Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.658760 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.698814 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.699967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.704094 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.840294 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.840393 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.840437 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwwxf\" (UniqueName: \"kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.939111 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.942138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwwxf\" (UniqueName: \"kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.942220 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.942279 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.943220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.944030 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" 
(UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.962988 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.964636 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:44 crc kubenswrapper[5033]: I0319 19:13:44.984183 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.003589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwwxf\" (UniqueName: \"kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf\") pod \"dnsmasq-dns-5ccc8479f9-glns8\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.023535 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.145244 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.145564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.145600 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmpxp\" (UniqueName: \"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.246791 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.246839 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.246876 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmpxp\" (UniqueName: \"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.247887 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.247979 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.269263 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmpxp\" (UniqueName: \"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp\") pod \"dnsmasq-dns-57d769cc4f-q4fv7\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.304819 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.469737 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:13:45 crc kubenswrapper[5033]: W0319 19:13:45.493071 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6862789_4dd7_4159_b6ef_9b0f6e605725.slice/crio-9d751c8febe261b43696644633e45794d1c270ba1ddcebbc9e3264cbd5f43b76 WatchSource:0}: Error finding container 9d751c8febe261b43696644633e45794d1c270ba1ddcebbc9e3264cbd5f43b76: Status 404 returned error can't find the container with id 9d751c8febe261b43696644633e45794d1c270ba1ddcebbc9e3264cbd5f43b76 Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.779232 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.845836 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.847048 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.851703 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.851824 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.851847 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.852034 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.852145 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.852255 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.852427 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xgb79" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.870109 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961173 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961240 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961265 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961291 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961312 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961337 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vggq\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961366 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961394 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961529 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:45 crc kubenswrapper[5033]: I0319 19:13:45.961556 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.062905 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.062956 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.062988 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063030 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063117 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063140 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063163 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063181 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.063196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vggq\" (UniqueName: 
\"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.065355 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.067653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.069629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.092409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.092717 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.098763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.100284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.100607 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.100629 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c2787f9d79718a101190626a8fac4e044cb69ec532c273cd4d0472dce50cb36/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.101122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.121383 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vggq\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.121406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.136712 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.140644 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155083 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155288 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-77h7k" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155292 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155420 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155512 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155657 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.155778 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.159218 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265431 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265501 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265550 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265598 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265652 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265679 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265708 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265737 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7ks\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.265758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.316415 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" 
event={"ID":"a6862789-4dd7-4159-b6ef-9b0f6e605725","Type":"ContainerStarted","Data":"9d751c8febe261b43696644633e45794d1c270ba1ddcebbc9e3264cbd5f43b76"} Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.324613 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" event={"ID":"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e","Type":"ContainerStarted","Data":"baefc3cc432643abd9d796bccc3be0e91a212f2493d9f9d1f9bba77d063f36e2"} Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.367870 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.367922 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.367949 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.367971 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.367998 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368045 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368080 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368129 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.368159 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7ks\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.369501 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.369561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.370252 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.370254 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " 
pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.371534 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.374533 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.374929 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.375007 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.375025 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb05dd1e0a4f02545007a3008c7d9e3f987cb58a1dd67affb0d78d89041635cc/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.378166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.380697 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.386303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7ks\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.402395 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3fd2f356-f4a3-4256-905e-581b33d3a974\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.424775 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " pod="openstack/rabbitmq-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.469205 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:13:46 crc kubenswrapper[5033]: I0319 19:13:46.499862 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.035642 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:13:47 crc kubenswrapper[5033]: W0319 19:13:47.043022 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd2f356_f4a3_4256_905e_581b33d3a974.slice/crio-f514a423ef066900ac6fdcd134a22ed9d17b28ab9c8f9b8183cb3b96ffdea77c WatchSource:0}: Error finding container f514a423ef066900ac6fdcd134a22ed9d17b28ab9c8f9b8183cb3b96ffdea77c: Status 404 returned error can't find the container with id f514a423ef066900ac6fdcd134a22ed9d17b28ab9c8f9b8183cb3b96ffdea77c Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.093008 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.314546 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.316113 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.317275 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.318134 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.318807 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-vbf8s" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.319372 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.319965 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.324333 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.353988 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerStarted","Data":"f514a423ef066900ac6fdcd134a22ed9d17b28ab9c8f9b8183cb3b96ffdea77c"} Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.386009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.386587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.386650 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hf4d\" (UniqueName: \"kubernetes.io/projected/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kube-api-access-7hf4d\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.386691 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.386919 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.387094 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.387248 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.387283 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hf4d\" (UniqueName: \"kubernetes.io/projected/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kube-api-access-7hf4d\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488248 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488298 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488348 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488369 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.488431 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.497148 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.497190 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb2f1152761fd39a9d7cea28d215439ae4b3475c9353563709f8eeda91188698/globalmount\"" pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.498572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.502867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.505853 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.510971 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.511005 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac9db1d-1045-42f9-a7af-1c118226d1d2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.511053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hf4d\" (UniqueName: \"kubernetes.io/projected/0ac9db1d-1045-42f9-a7af-1c118226d1d2-kube-api-access-7hf4d\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.512223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac9db1d-1045-42f9-a7af-1c118226d1d2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.535304 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c9a1d0de-42ff-4efe-ad86-b86ce6019122\") pod \"openstack-galera-0\" (UID: \"0ac9db1d-1045-42f9-a7af-1c118226d1d2\") " pod="openstack/openstack-galera-0" Mar 19 19:13:47 crc kubenswrapper[5033]: I0319 19:13:47.650022 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.502008 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.505611 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.508203 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.508591 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.508737 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4dl6k" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.511578 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.514749 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.610816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85185bfa-1205-4129-8f90-55b580fd3939-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.610889 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.610958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.611008 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.611043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tklv\" (UniqueName: \"kubernetes.io/projected/85185bfa-1205-4129-8f90-55b580fd3939-kube-api-access-7tklv\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.611069 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.611096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.611119 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712239 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712305 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tklv\" (UniqueName: \"kubernetes.io/projected/85185bfa-1205-4129-8f90-55b580fd3939-kube-api-access-7tklv\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712334 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712366 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712387 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712413 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712435 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85185bfa-1205-4129-8f90-55b580fd3939-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.712543 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.714203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.715517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.716882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85185bfa-1205-4129-8f90-55b580fd3939-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.717115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85185bfa-1205-4129-8f90-55b580fd3939-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.719882 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.719929 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/39ee6deab349733dcb4df5c8cedaba9322927e4f8277fb8c22d474681e7ad5ba/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.731505 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.732214 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85185bfa-1205-4129-8f90-55b580fd3939-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.745562 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tklv\" (UniqueName: \"kubernetes.io/projected/85185bfa-1205-4129-8f90-55b580fd3939-kube-api-access-7tklv\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.826311 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.827939 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.828559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de2bcddc-7409-4f86-8580-9c6965d5559c\") pod \"openstack-cell1-galera-0\" (UID: \"85185bfa-1205-4129-8f90-55b580fd3939\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.848287 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nw677" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.848561 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.850564 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.859968 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.915561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.915615 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8fl\" (UniqueName: \"kubernetes.io/projected/862eb5fe-aecf-465c-a30b-5f9c0477d625-kube-api-access-rx8fl\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.915645 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-kolla-config\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.915683 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-config-data\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:48 crc kubenswrapper[5033]: I0319 19:13:48.915743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.017113 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.017232 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.017280 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8fl\" (UniqueName: 
\"kubernetes.io/projected/862eb5fe-aecf-465c-a30b-5f9c0477d625-kube-api-access-rx8fl\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.017320 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-kolla-config\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.018138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-kolla-config\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.018526 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-config-data\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.019154 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862eb5fe-aecf-465c-a30b-5f9c0477d625-config-data\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.021730 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.024645 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862eb5fe-aecf-465c-a30b-5f9c0477d625-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.043275 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8fl\" (UniqueName: \"kubernetes.io/projected/862eb5fe-aecf-465c-a30b-5f9c0477d625-kube-api-access-rx8fl\") pod \"memcached-0\" (UID: \"862eb5fe-aecf-465c-a30b-5f9c0477d625\") " pod="openstack/memcached-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.143398 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 19:13:49 crc kubenswrapper[5033]: I0319 19:13:49.184341 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 19:13:50 crc kubenswrapper[5033]: I0319 19:13:50.985543 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:13:50 crc kubenswrapper[5033]: I0319 19:13:50.990242 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:13:50 crc kubenswrapper[5033]: I0319 19:13:50.996703 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8ckk4" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.004315 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.052694 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrkm\" (UniqueName: \"kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm\") pod \"kube-state-metrics-0\" (UID: \"2baa96fb-8508-4335-b43e-4ec2da1af123\") " pod="openstack/kube-state-metrics-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.154321 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrkm\" (UniqueName: \"kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm\") pod \"kube-state-metrics-0\" (UID: \"2baa96fb-8508-4335-b43e-4ec2da1af123\") " pod="openstack/kube-state-metrics-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.172249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrkm\" (UniqueName: \"kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm\") pod \"kube-state-metrics-0\" (UID: \"2baa96fb-8508-4335-b43e-4ec2da1af123\") " pod="openstack/kube-state-metrics-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.306109 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.550065 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.551888 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.557315 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.557791 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-bn8zz" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.558271 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.560412 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.560646 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.617777 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660603 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660643 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660670 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660696 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660713 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rww2h\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-kube-api-access-rww2h\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.660761 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rww2h\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-kube-api-access-rww2h\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762579 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762618 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.762665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.763598 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.766369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 
19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.766694 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.766787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.766810 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.775001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.788756 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rww2h\" (UniqueName: \"kubernetes.io/projected/1b5fab5b-14ba-4b0a-adb3-f4bad7edac99-kube-api-access-rww2h\") pod \"alertmanager-metric-storage-0\" (UID: \"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:51 crc kubenswrapper[5033]: I0319 19:13:51.937186 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.561188 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.562828 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567283 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567381 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567481 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567296 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-59b22" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567289 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.567885 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 
19:13:52.574065 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574089 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wl4\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574168 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574181 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574212 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574395 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.574432 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 
19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.575945 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.576718 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675510 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675536 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc 
kubenswrapper[5033]: I0319 19:13:52.675595 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675728 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wl4\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675785 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.675832 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.680998 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.684802 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.685053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.685297 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.685654 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.689144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.690623 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.691850 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.691891 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/166953177fb20786f8e1d18631ecc7a8cdf1ccf34ca7e3b1bfc1a12ac011aaeb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.692402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.710529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wl4\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.735190 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:52 crc kubenswrapper[5033]: I0319 19:13:52.878955 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:13:54 crc kubenswrapper[5033]: W0319 19:13:54.356507 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee035802_be9d_40dc_9f6c_3cb58bcb13d6.slice/crio-7f5335cab0e0df384bd865e9a0f45ff1e546c5b2da410a11e80b6bc521c9b3e7 WatchSource:0}: Error finding container 7f5335cab0e0df384bd865e9a0f45ff1e546c5b2da410a11e80b6bc521c9b3e7: Status 404 returned error can't find the container with id 7f5335cab0e0df384bd865e9a0f45ff1e546c5b2da410a11e80b6bc521c9b3e7 Mar 19 19:13:54 crc kubenswrapper[5033]: I0319 19:13:54.414824 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerStarted","Data":"7f5335cab0e0df384bd865e9a0f45ff1e546c5b2da410a11e80b6bc521c9b3e7"} Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.027801 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.029314 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.031397 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.031690 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-z58mc" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.033944 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.033997 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.034571 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.079438 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.141022 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8z6ts"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.142168 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.144017 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bxp8s" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.146112 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.146341 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.147423 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-d4vfd"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.149562 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.155952 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.162741 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d4vfd"] Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216497 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216543 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216569 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216595 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-config\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216692 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216716 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dxf\" (UniqueName: \"kubernetes.io/projected/035d8393-f8cc-4c44-b116-245b5e93e70c-kube-api-access-28dxf\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216768 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " 
pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.216857 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.317963 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28dxf\" (UniqueName: \"kubernetes.io/projected/035d8393-f8cc-4c44-b116-245b5e93e70c-kube-api-access-28dxf\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318012 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-scripts\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318036 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-scripts\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-lib\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 
19:13:55.318078 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-etc-ovs\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6b22\" (UniqueName: \"kubernetes.io/projected/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-kube-api-access-d6b22\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318116 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318140 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-ovn-controller-tls-certs\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318162 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-log\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: 
I0319 19:13:55.318193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318209 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318223 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-run\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318243 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318260 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318277 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-config\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-combined-ca-bundle\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318351 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-log-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318375 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318407 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.318426 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpxj\" (UniqueName: \"kubernetes.io/projected/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-kube-api-access-5mpxj\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.324046 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.326092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-config\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.329772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.332440 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035d8393-f8cc-4c44-b116-245b5e93e70c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.333111 5033 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.333141 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/586af8ea28264018b2163a2a5025d6ae82ed887db438f616b4f7aa452faf3d9a/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.334263 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.339084 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035d8393-f8cc-4c44-b116-245b5e93e70c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.348025 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dxf\" (UniqueName: \"kubernetes.io/projected/035d8393-f8cc-4c44-b116-245b5e93e70c-kube-api-access-28dxf\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.403750 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e5b1d957-3f6e-4b7a-a033-02973a554823\") pod \"ovsdbserver-nb-0\" (UID: \"035d8393-f8cc-4c44-b116-245b5e93e70c\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-combined-ca-bundle\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419478 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-log-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419507 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419538 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpxj\" (UniqueName: \"kubernetes.io/projected/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-kube-api-access-5mpxj\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419566 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-scripts\") pod \"ovn-controller-ovs-d4vfd\" (UID: 
\"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-scripts\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-lib\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-etc-ovs\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419637 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6b22\" (UniqueName: \"kubernetes.io/projected/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-kube-api-access-d6b22\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419663 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-ovn-controller-tls-certs\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc 
kubenswrapper[5033]: I0319 19:13:55.419681 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-log\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.419724 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-run\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.420206 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-run\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.421000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-lib\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.421144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-log-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.421199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.422255 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-etc-ovs\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.422936 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-var-log\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.423114 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-var-run-ovn\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.423287 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-combined-ca-bundle\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 
19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.432293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-scripts\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.434426 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-scripts\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.437594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-ovn-controller-tls-certs\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.438472 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpxj\" (UniqueName: \"kubernetes.io/projected/ebcc8953-fc35-48d7-a3fd-be1a2291c08c-kube-api-access-5mpxj\") pod \"ovn-controller-8z6ts\" (UID: \"ebcc8953-fc35-48d7-a3fd-be1a2291c08c\") " pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.438674 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6b22\" (UniqueName: \"kubernetes.io/projected/7b48e9b9-7f87-4d91-ad8c-eb50df3b6534-kube-api-access-d6b22\") pod \"ovn-controller-ovs-d4vfd\" (UID: \"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534\") " pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.470668 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.478379 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:13:55 crc kubenswrapper[5033]: I0319 19:13:55.689277 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.259879 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.261420 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.263225 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.263695 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-67c8s" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.265107 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.265148 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.265352 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.296513 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.384566 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m7nx\" (UniqueName: \"kubernetes.io/projected/97b0c498-1aed-420e-922d-9d04f4ac6c63-kube-api-access-6m7nx\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.384614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.384653 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.384696 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.384759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: 
\"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.486111 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m7nx\" (UniqueName: \"kubernetes.io/projected/97b0c498-1aed-420e-922d-9d04f4ac6c63-kube-api-access-6m7nx\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.486159 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.486192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.486228 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: 
\"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.486291 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.492217 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.495065 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.495773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc 
kubenswrapper[5033]: I0319 19:13:58.495958 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b0c498-1aed-420e-922d-9d04f4ac6c63-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.513602 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m7nx\" (UniqueName: \"kubernetes.io/projected/97b0c498-1aed-420e-922d-9d04f4ac6c63-kube-api-access-6m7nx\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5\" (UID: \"97b0c498-1aed-420e-922d-9d04f4ac6c63\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.596472 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.649429 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.650464 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.654421 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.654499 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.655689 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.664227 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.738537 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.746108 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.746760 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.751628 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.756764 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793320 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793385 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdl5\" (UniqueName: \"kubernetes.io/projected/58e55e58-fd66-4bac-9461-895c0f713861-kube-api-access-nmdl5\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793481 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " 
pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793559 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793610 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.793786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900411 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900471 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfztp\" (UniqueName: \"kubernetes.io/projected/1da94e31-ccb7-43e3-a22c-36d9d9a35933-kube-api-access-hfztp\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900550 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900657 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900699 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900718 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900737 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdl5\" (UniqueName: \"kubernetes.io/projected/58e55e58-fd66-4bac-9461-895c0f713861-kube-api-access-nmdl5\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: 
\"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.900758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.902339 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.902367 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e55e58-fd66-4bac-9461-895c0f713861-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.916307 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.918220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.923287 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.923978 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/58e55e58-fd66-4bac-9461-895c0f713861-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.924469 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.934942 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935180 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935257 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935377 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935462 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935560 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935694 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.935786 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-pmtzl" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.936416 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdl5\" (UniqueName: \"kubernetes.io/projected/58e55e58-fd66-4bac-9461-895c0f713861-kube-api-access-nmdl5\") pod \"cloudkitty-lokistack-querier-668f98fdd7-t65p9\" (UID: \"58e55e58-fd66-4bac-9461-895c0f713861\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:58 crc kubenswrapper[5033]: 
I0319 19:13:58.949927 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.952719 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.962180 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t"] Mar 19 19:13:58 crc kubenswrapper[5033]: I0319 19:13:58.995476 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.001833 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.001878 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.001935 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: 
\"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.001959 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfztp\" (UniqueName: \"kubernetes.io/projected/1da94e31-ccb7-43e3-a22c-36d9d9a35933-kube-api-access-hfztp\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.002019 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.004137 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.016114 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.016882 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.018828 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1da94e31-ccb7-43e3-a22c-36d9d9a35933-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.024166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfztp\" (UniqueName: \"kubernetes.io/projected/1da94e31-ccb7-43e3-a22c-36d9d9a35933-kube-api-access-hfztp\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-2pgf8\" (UID: \"1da94e31-ccb7-43e3-a22c-36d9d9a35933\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.078965 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103828 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103872 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103892 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qvx2\" (UniqueName: \"kubernetes.io/projected/74fb1224-a73c-47ca-ac3b-d23ed2116a84-kube-api-access-2qvx2\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103915 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc 
kubenswrapper[5033]: I0319 19:13:59.103941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103972 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.103986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104003 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104023 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104044 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104066 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104093 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104139 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: 
\"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104156 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104178 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104214 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52pj\" (UniqueName: \"kubernetes.io/projected/4b6b27fd-56c6-4473-be2b-eb469e816a08-kube-api-access-m52pj\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.104231 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.205865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.205898 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.205930 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.205955 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc 
kubenswrapper[5033]: I0319 19:13:59.205986 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52pj\" (UniqueName: \"kubernetes.io/projected/4b6b27fd-56c6-4473-be2b-eb469e816a08-kube-api-access-m52pj\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206058 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206076 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qvx2\" (UniqueName: 
\"kubernetes.io/projected/74fb1224-a73c-47ca-ac3b-d23ed2116a84-kube-api-access-2qvx2\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206091 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206117 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206147 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 
19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206198 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206214 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206239 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.206268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.207095 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.207959 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.208504 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.208824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.209423 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.209847 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.210217 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.210767 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.211311 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.213377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.213493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/4b6b27fd-56c6-4473-be2b-eb469e816a08-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.213997 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/4b6b27fd-56c6-4473-be2b-eb469e816a08-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.214399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/74fb1224-a73c-47ca-ac3b-d23ed2116a84-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.217767 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" 
(UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.218348 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.223675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/74fb1224-a73c-47ca-ac3b-d23ed2116a84-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.223897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qvx2\" (UniqueName: \"kubernetes.io/projected/74fb1224-a73c-47ca-ac3b-d23ed2116a84-kube-api-access-2qvx2\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9\" (UID: \"74fb1224-a73c-47ca-ac3b-d23ed2116a84\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.227179 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52pj\" (UniqueName: \"kubernetes.io/projected/4b6b27fd-56c6-4473-be2b-eb469e816a08-kube-api-access-m52pj\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t\" (UID: \"4b6b27fd-56c6-4473-be2b-eb469e816a08\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc 
kubenswrapper[5033]: I0319 19:13:59.304899 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.347825 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.349359 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.355172 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.355198 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.355282 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-v6jm4" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.355331 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.358810 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.366167 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.471608 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.472918 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.483855 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.485626 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.493315 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510514 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-config\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510559 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510598 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510684 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krn6v\" (UniqueName: \"kubernetes.io/projected/314e5a54-3e9c-42ce-807e-f798a2ab66f9-kube-api-access-krn6v\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510710 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.510794 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-config\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612596 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612621 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612688 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612737 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612767 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krn6v\" (UniqueName: \"kubernetes.io/projected/314e5a54-3e9c-42ce-807e-f798a2ab66f9-kube-api-access-krn6v\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612787 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612817 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612861 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612885 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612903 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2gg\" (UniqueName: \"kubernetes.io/projected/31733aba-46c2-4129-9088-e294daafa285-kube-api-access-bm2gg\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.612926 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.613752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.614129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-config\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.614226 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/314e5a54-3e9c-42ce-807e-f798a2ab66f9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.617357 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.617891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.618061 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.618112 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fa8438418623a61265e5354f382700737e5b5c8f8fcf519f0b4501648027e22a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.625409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314e5a54-3e9c-42ce-807e-f798a2ab66f9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.632121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krn6v\" (UniqueName: \"kubernetes.io/projected/314e5a54-3e9c-42ce-807e-f798a2ab66f9-kube-api-access-krn6v\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.666524 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77ebbe5c-6a9f-4d37-a9df-745e7ce1a1dd\") pod \"ovsdbserver-sb-0\" (UID: \"314e5a54-3e9c-42ce-807e-f798a2ab66f9\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 
19:13:59.688215 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714679 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714754 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2gg\" (UniqueName: \"kubernetes.io/projected/31733aba-46c2-4129-9088-e294daafa285-kube-api-access-bm2gg\") pod 
\"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714781 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.714915 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.716286 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.717132 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.718103 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.718202 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.723270 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.735732 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.739731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.742026 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.744158 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.744655 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.745575 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.751596 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/31733aba-46c2-4129-9088-e294daafa285-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.752768 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2gg\" (UniqueName: \"kubernetes.io/projected/31733aba-46c2-4129-9088-e294daafa285-kube-api-access-bm2gg\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.753536 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 
19:13:59.759350 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"31733aba-46c2-4129-9088-e294daafa285\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.816739 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqgs\" (UniqueName: \"kubernetes.io/projected/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-kube-api-access-mtqgs\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.816829 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.816995 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.817084 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.817120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.817174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.817329 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.837660 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.839745 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.843335 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.843390 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.858585 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.873763 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919328 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919362 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-grpc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919482 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919526 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919674 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xt5b6\" (UniqueName: \"kubernetes.io/projected/7574570b-6325-4897-a35c-5712967a74f3-kube-api-access-xt5b6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919857 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.919904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.920081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.920162 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc 
kubenswrapper[5033]: I0319 19:13:59.920195 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.920281 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.920304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqgs\" (UniqueName: \"kubernetes.io/projected/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-kube-api-access-mtqgs\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.920782 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.922556 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" 
Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.925490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.927692 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.927700 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.935434 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqgs\" (UniqueName: \"kubernetes.io/projected/fc506eb7-d9c5-4db4-9707-53ff8923ef3b-kube-api-access-mtqgs\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:13:59 crc kubenswrapper[5033]: I0319 19:13:59.944801 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"fc506eb7-d9c5-4db4-9707-53ff8923ef3b\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.021418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.021689 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.021777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.021893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5b6\" (UniqueName: \"kubernetes.io/projected/7574570b-6325-4897-a35c-5712967a74f3-kube-api-access-xt5b6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.021988 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.022112 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.022189 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.022420 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.023521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.024967 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7574570b-6325-4897-a35c-5712967a74f3-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.025522 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.026143 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.043594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7574570b-6325-4897-a35c-5712967a74f3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.059351 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.066684 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5b6\" (UniqueName: \"kubernetes.io/projected/7574570b-6325-4897-a35c-5712967a74f3-kube-api-access-xt5b6\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.072529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"7574570b-6325-4897-a35c-5712967a74f3\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.136666 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565794-qn7zn"] Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.137866 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.139598 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.139860 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.146039 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.155115 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-qn7zn"] Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.166077 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.228050 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zcsc\" (UniqueName: \"kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc\") pod \"auto-csr-approver-29565794-qn7zn\" (UID: \"bb876edd-c30e-4253-ac09-9db2e08dc2fc\") " pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.329262 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zcsc\" (UniqueName: \"kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc\") pod \"auto-csr-approver-29565794-qn7zn\" (UID: \"bb876edd-c30e-4253-ac09-9db2e08dc2fc\") " pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.349720 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zcsc\" 
(UniqueName: \"kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc\") pod \"auto-csr-approver-29565794-qn7zn\" (UID: \"bb876edd-c30e-4253-ac09-9db2e08dc2fc\") " pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:00 crc kubenswrapper[5033]: I0319 19:14:00.457172 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:03 crc kubenswrapper[5033]: I0319 19:14:03.787568 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:14:03 crc kubenswrapper[5033]: I0319 19:14:03.927093 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.797378 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.797560 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2w4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6f7xz_openstack(790aa7fd-a126-47a9-8cab-14b3242f7c59): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.798017 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.798255 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8xxb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:ni
l,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lxj6v_openstack(a443183b-201b-4c13-984d-ce72e248a176): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.799355 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" podUID="a443183b-201b-4c13-984d-ce72e248a176" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.799418 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" podUID="790aa7fd-a126-47a9-8cab-14b3242f7c59" Mar 19 19:14:04 crc kubenswrapper[5033]: W0319 19:14:04.829736 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod862eb5fe_aecf_465c_a30b_5f9c0477d625.slice/crio-719980e9270cab55bf5e0f1e88a69472de2c445d194cf6894cdb6febf86bee6f WatchSource:0}: Error finding container 719980e9270cab55bf5e0f1e88a69472de2c445d194cf6894cdb6febf86bee6f: Status 404 returned error can't find the container with id 719980e9270cab55bf5e0f1e88a69472de2c445d194cf6894cdb6febf86bee6f Mar 19 19:14:04 crc kubenswrapper[5033]: W0319 19:14:04.833034 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85185bfa_1205_4129_8f90_55b580fd3939.slice/crio-16c74c99365bec29421d95becab5465a22688f0866a7146541c32941b58d13e4 WatchSource:0}: Error finding container 16c74c99365bec29421d95becab5465a22688f0866a7146541c32941b58d13e4: Status 404 returned error can't find the container with id 16c74c99365bec29421d95becab5465a22688f0866a7146541c32941b58d13e4 Mar 19 19:14:04 crc kubenswrapper[5033]: I0319 19:14:04.835770 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.870431 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.870675 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwwxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-glns8_openstack(a6862789-4dd7-4159-b6ef-9b0f6e605725): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.871812 5033 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.887791 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.887945 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmpxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-q4fv7_openstack(d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:14:04 crc kubenswrapper[5033]: E0319 19:14:04.890489 5033 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" Mar 19 19:14:05 crc kubenswrapper[5033]: I0319 19:14:05.317015 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:14:05 crc kubenswrapper[5033]: I0319 19:14:05.359549 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 19:14:05 crc kubenswrapper[5033]: I0319 19:14:05.399939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:14:05 crc kubenswrapper[5033]: I0319 19:14:05.547267 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85185bfa-1205-4129-8f90-55b580fd3939","Type":"ContainerStarted","Data":"16c74c99365bec29421d95becab5465a22688f0866a7146541c32941b58d13e4"} Mar 19 19:14:05 crc kubenswrapper[5033]: I0319 19:14:05.549225 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"862eb5fe-aecf-465c-a30b-5f9c0477d625","Type":"ContainerStarted","Data":"719980e9270cab55bf5e0f1e88a69472de2c445d194cf6894cdb6febf86bee6f"} Mar 19 19:14:05 crc kubenswrapper[5033]: E0319 19:14:05.551628 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" Mar 19 19:14:05 crc kubenswrapper[5033]: E0319 19:14:05.552579 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" Mar 19 19:14:06 crc kubenswrapper[5033]: W0319 19:14:06.373919 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5fab5b_14ba_4b0a_adb3_f4bad7edac99.slice/crio-461b9deddc6a2e882f43d56e7a91bd1ac0c13c2212899d061527be24062fade7 WatchSource:0}: Error finding container 461b9deddc6a2e882f43d56e7a91bd1ac0c13c2212899d061527be24062fade7: Status 404 returned error can't find the container with id 461b9deddc6a2e882f43d56e7a91bd1ac0c13c2212899d061527be24062fade7 Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.567041 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" event={"ID":"a443183b-201b-4c13-984d-ce72e248a176","Type":"ContainerDied","Data":"54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46"} Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.567394 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54088c13138c714cb95c4d158ba3ae0f615a029b12fe575199238d9573d3da46" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.572441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerStarted","Data":"a311d0dd03776dcff7bf83d71d6264d533e4ef00b6678e96e932188531ec4f5f"} Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.574128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99","Type":"ContainerStarted","Data":"461b9deddc6a2e882f43d56e7a91bd1ac0c13c2212899d061527be24062fade7"} Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.575227 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" event={"ID":"790aa7fd-a126-47a9-8cab-14b3242f7c59","Type":"ContainerDied","Data":"0fe3fc878bbb05a139ea7b51a09057108b3f39225d3d17ab9e603a7c14407983"} Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.575301 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe3fc878bbb05a139ea7b51a09057108b3f39225d3d17ab9e603a7c14407983" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.576838 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2baa96fb-8508-4335-b43e-4ec2da1af123","Type":"ContainerStarted","Data":"4f40039fca189e3cad106d3f5411ce6a56494518e303cd3df586ea625b731893"} Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.714142 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.776695 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config\") pod \"a443183b-201b-4c13-984d-ce72e248a176\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.776778 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxb7\" (UniqueName: \"kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7\") pod \"a443183b-201b-4c13-984d-ce72e248a176\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.776861 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc\") pod \"a443183b-201b-4c13-984d-ce72e248a176\" (UID: \"a443183b-201b-4c13-984d-ce72e248a176\") " Mar 19 19:14:06 crc kubenswrapper[5033]: 
I0319 19:14:06.777729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a443183b-201b-4c13-984d-ce72e248a176" (UID: "a443183b-201b-4c13-984d-ce72e248a176"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.778506 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config" (OuterVolumeSpecName: "config") pod "a443183b-201b-4c13-984d-ce72e248a176" (UID: "a443183b-201b-4c13-984d-ce72e248a176"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.785563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7" (OuterVolumeSpecName: "kube-api-access-8xxb7") pod "a443183b-201b-4c13-984d-ce72e248a176" (UID: "a443183b-201b-4c13-984d-ce72e248a176"). InnerVolumeSpecName "kube-api-access-8xxb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.807095 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.878653 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2w4p\" (UniqueName: \"kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p\") pod \"790aa7fd-a126-47a9-8cab-14b3242f7c59\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.878874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config\") pod \"790aa7fd-a126-47a9-8cab-14b3242f7c59\" (UID: \"790aa7fd-a126-47a9-8cab-14b3242f7c59\") " Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.879254 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.879272 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxb7\" (UniqueName: \"kubernetes.io/projected/a443183b-201b-4c13-984d-ce72e248a176-kube-api-access-8xxb7\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.879285 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a443183b-201b-4c13-984d-ce72e248a176-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.879752 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config" (OuterVolumeSpecName: "config") pod "790aa7fd-a126-47a9-8cab-14b3242f7c59" (UID: "790aa7fd-a126-47a9-8cab-14b3242f7c59"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.881782 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p" (OuterVolumeSpecName: "kube-api-access-q2w4p") pod "790aa7fd-a126-47a9-8cab-14b3242f7c59" (UID: "790aa7fd-a126-47a9-8cab-14b3242f7c59"). InnerVolumeSpecName "kube-api-access-q2w4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.980803 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790aa7fd-a126-47a9-8cab-14b3242f7c59-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:06 crc kubenswrapper[5033]: I0319 19:14:06.981138 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2w4p\" (UniqueName: \"kubernetes.io/projected/790aa7fd-a126-47a9-8cab-14b3242f7c59-kube-api-access-q2w4p\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.310686 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.339983 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.489237 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.571288 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.582841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d4vfd"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.609551 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.616534 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lxj6v" Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.618734 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6f7xz" Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.623325 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.677715 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.708154 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.727078 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.737435 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.745172 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-qn7zn"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.756240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.773976 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.779542 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-lxj6v"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.791393 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.797838 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6f7xz"] Mar 19 19:14:07 crc kubenswrapper[5033]: I0319 19:14:07.802958 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:14:08 crc kubenswrapper[5033]: I0319 19:14:08.631612 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790aa7fd-a126-47a9-8cab-14b3242f7c59" path="/var/lib/kubelet/pods/790aa7fd-a126-47a9-8cab-14b3242f7c59/volumes" Mar 19 19:14:08 crc kubenswrapper[5033]: I0319 19:14:08.631989 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a443183b-201b-4c13-984d-ce72e248a176" path="/var/lib/kubelet/pods/a443183b-201b-4c13-984d-ce72e248a176/volumes" Mar 19 19:14:08 crc kubenswrapper[5033]: I0319 19:14:08.632317 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerStarted","Data":"4a087ee37539a6611aa3f7f2252e9f01369edf2d536cb5388780ab9ac0c88afc"} Mar 19 19:14:08 crc kubenswrapper[5033]: I0319 19:14:08.632344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerStarted","Data":"a9c60b46b5a9fb57d068e917fa1f0d393fc4d26de4b8ab813a914a8467d5a93d"} Mar 19 19:14:08 crc kubenswrapper[5033]: W0319 19:14:08.694622 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc506eb7_d9c5_4db4_9707_53ff8923ef3b.slice/crio-85ae6496f5f16715669af9771bb542d5bea47c3a7bb56343019c9e17f3f646fb WatchSource:0}: Error finding container 
85ae6496f5f16715669af9771bb542d5bea47c3a7bb56343019c9e17f3f646fb: Status 404 returned error can't find the container with id 85ae6496f5f16715669af9771bb542d5bea47c3a7bb56343019c9e17f3f646fb Mar 19 19:14:08 crc kubenswrapper[5033]: W0319 19:14:08.695233 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb876edd_c30e_4253_ac09_9db2e08dc2fc.slice/crio-044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50 WatchSource:0}: Error finding container 044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50: Status 404 returned error can't find the container with id 044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50 Mar 19 19:14:08 crc kubenswrapper[5033]: W0319 19:14:08.719096 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebcc8953_fc35_48d7_a3fd_be1a2291c08c.slice/crio-6d635c9f025de86fced4e110a31b60e65d9277a0c3a0df0cd5234b05ad9686a2 WatchSource:0}: Error finding container 6d635c9f025de86fced4e110a31b60e65d9277a0c3a0df0cd5234b05ad9686a2: Status 404 returned error can't find the container with id 6d635c9f025de86fced4e110a31b60e65d9277a0c3a0df0cd5234b05ad9686a2 Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.740715 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 19:14:08 crc kubenswrapper[5033]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 19:14:08 crc kubenswrapper[5033]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zcsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565794-qn7zn_openshift-infra(bb876edd-c30e-4253-ac09-9db2e08dc2fc): ErrImagePull: pull QPS exceeded Mar 19 19:14:08 crc kubenswrapper[5033]: > logger="UnhandledError" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.742202 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.769037 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-ingester,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=ingester -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:wal,ReadOnly:false,MountPath:/tmp/wal,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca
,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bm2gg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-ingester-0_openstack(31733aba-46c2-4129-9088-e294daafa285): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.769070 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-index-gateway,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=index-gateway 
-config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-index-gateway-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-ac
cess-xt5b6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-index-gateway-0_openstack(7574570b-6325-4897-a35c-5712967a74f3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.769135 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66fhd6h98h547hfch9fh5b6hd9h88h66dh549h555h5f8h667hdch5fdhbfh594h9bh669h678h64fh66fh567h676h66hfbh5bh587h685h55h5f9q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krn6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecActio
n{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(314e5a54-3e9c-42ce-807e-f798a2ab66f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.770225 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="7574570b-6325-4897-a35c-5712967a74f3" Mar 19 19:14:08 crc 
kubenswrapper[5033]: E0319 19:14:08.770212 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.770830 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n66fhd6h98h547hfch9fh5b6hd9h88h66dh549h555h5f8h667hdch5fdhbfh594h9bh669h678h64fh66fh567h676h66hfbh5bh587h685h55h5f9q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krn6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},
LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(314e5a54-3e9c-42ce-807e-f798a2ab66f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:08 crc kubenswrapper[5033]: E0319 19:14:08.772427 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack/ovsdbserver-sb-0" podUID="314e5a54-3e9c-42ce-807e-f798a2ab66f9" Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.676764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d4vfd" event={"ID":"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534","Type":"ContainerStarted","Data":"6fa79efd6964534b782fb87728c1d3cf19c501ba47abb6d4b1404faa3b410741"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.678181 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" event={"ID":"97b0c498-1aed-420e-922d-9d04f4ac6c63","Type":"ContainerStarted","Data":"c6be688da2437efb84cf792c4d56930d8063f1c3d2c6811d8e73dbdd94111fe6"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.682620 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"035d8393-f8cc-4c44-b116-245b5e93e70c","Type":"ContainerStarted","Data":"72b8e19a7063b9b5902e89bb6771880afca8ac5c0413fbd11a7895aaea15ae43"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.685997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" event={"ID":"1da94e31-ccb7-43e3-a22c-36d9d9a35933","Type":"ContainerStarted","Data":"9b8cf0329c750f2c1079416fa5f9c08531e4f30a498af2477f63b0ec6f5e2cfc"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.690122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" event={"ID":"58e55e58-fd66-4bac-9461-895c0f713861","Type":"ContainerStarted","Data":"fe5ebcbecf8406401594ea21d1fb4e15707cb8ad568e49a40c2b8027c0bc3397"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.691324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"314e5a54-3e9c-42ce-807e-f798a2ab66f9","Type":"ContainerStarted","Data":"533ead5353630d0bd03621f2b399fd04005d29214c8e4e22d137233a24e90db2"} Mar 19 19:14:09 crc kubenswrapper[5033]: E0319 19:14:09.694674 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-sb-0" podUID="314e5a54-3e9c-42ce-807e-f798a2ab66f9" Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.695565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" 
event={"ID":"31733aba-46c2-4129-9088-e294daafa285","Type":"ContainerStarted","Data":"56211236f227e67a0907567225db75178c457f68f2431b68c1ccba5c84d54824"} Mar 19 19:14:09 crc kubenswrapper[5033]: E0319 19:14:09.696804 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.698533 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" event={"ID":"bb876edd-c30e-4253-ac09-9db2e08dc2fc","Type":"ContainerStarted","Data":"044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.701106 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"fc506eb7-d9c5-4db4-9707-53ff8923ef3b","Type":"ContainerStarted","Data":"85ae6496f5f16715669af9771bb542d5bea47c3a7bb56343019c9e17f3f646fb"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.702257 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" event={"ID":"4b6b27fd-56c6-4473-be2b-eb469e816a08","Type":"ContainerStarted","Data":"a4e9f098ed4acf2baf3d5cbf2e66d763f0d4f4f8c8d622470d78ade2a74e75bc"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.703507 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ac9db1d-1045-42f9-a7af-1c118226d1d2","Type":"ContainerStarted","Data":"93f0eb78ff712b9ccf7da5fb6a9796012533345a03f853372c51a303e490af02"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.707196 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"7574570b-6325-4897-a35c-5712967a74f3","Type":"ContainerStarted","Data":"899b3a8e9df0725debdaa0d3820316dd72916bd44e2a1e351d02e7f2422fb7af"} Mar 19 19:14:09 crc kubenswrapper[5033]: E0319 19:14:09.710415 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="7574570b-6325-4897-a35c-5712967a74f3" Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.711692 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts" event={"ID":"ebcc8953-fc35-48d7-a3fd-be1a2291c08c","Type":"ContainerStarted","Data":"6d635c9f025de86fced4e110a31b60e65d9277a0c3a0df0cd5234b05ad9686a2"} Mar 19 19:14:09 crc kubenswrapper[5033]: I0319 19:14:09.714408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" event={"ID":"74fb1224-a73c-47ca-ac3b-d23ed2116a84","Type":"ContainerStarted","Data":"0e030f61d5f37e3d5666a7833ff1af28a5fc5af1064f028f13f0a9b26dadd6d2"} Mar 19 19:14:10 crc kubenswrapper[5033]: E0319 19:14:10.726694 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-index-gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" podUID="7574570b-6325-4897-a35c-5712967a74f3" Mar 19 19:14:10 crc kubenswrapper[5033]: E0319 19:14:10.726710 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" Mar 19 19:14:10 crc kubenswrapper[5033]: E0319 19:14:10.730639 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-sb-0" podUID="314e5a54-3e9c-42ce-807e-f798a2ab66f9" Mar 19 19:14:10 crc kubenswrapper[5033]: I0319 19:14:10.759016 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:14:10 crc kubenswrapper[5033]: I0319 19:14:10.759087 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:14:11 crc kubenswrapper[5033]: E0319 19:14:11.516941 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" Mar 19 19:14:12 crc kubenswrapper[5033]: 
E0319 19:14:12.085760 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" Mar 19 19:14:19 crc kubenswrapper[5033]: E0319 19:14:19.554947 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 19 19:14:19 crc kubenswrapper[5033]: E0319 19:14:19.555594 5033 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 19 19:14:19 crc kubenswrapper[5033]: E0319 19:14:19.555749 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(2baa96fb-8508-4335-b43e-4ec2da1af123): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 19:14:19 crc kubenswrapper[5033]: E0319 19:14:19.556926 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" Mar 19 19:14:19 crc kubenswrapper[5033]: E0319 19:14:19.796602 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.810497 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85185bfa-1205-4129-8f90-55b580fd3939","Type":"ContainerStarted","Data":"b63eb97db3cb695dbaeeade51398e131936d3383bee25871cd01307ba8039f95"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.814252 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" event={"ID":"97b0c498-1aed-420e-922d-9d04f4ac6c63","Type":"ContainerStarted","Data":"08abbdce1879e034e4491691ac59e5dd05294029497408a470e8d1a819b5a5b6"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.814861 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.816644 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"862eb5fe-aecf-465c-a30b-5f9c0477d625","Type":"ContainerStarted","Data":"ae9b749451736b54b2abad6d8122e86ac8231bfa3538759fa0027bb1129a5fd1"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.816699 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.817993 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" event={"ID":"4b6b27fd-56c6-4473-be2b-eb469e816a08","Type":"ContainerStarted","Data":"0f259579ad685a92aae203694d4365b53a2aad8b9bdd985b84da835d65fb20be"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.818402 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.819988 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" event={"ID":"1da94e31-ccb7-43e3-a22c-36d9d9a35933","Type":"ContainerStarted","Data":"d03cca7384a1eb9ff742248644b5036ef3c6fedde45c7b92280d76873638f4ed"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.820547 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.822005 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" event={"ID":"58e55e58-fd66-4bac-9461-895c0f713861","Type":"ContainerStarted","Data":"ea0681fc88237ef0276e497ab0f0b86a0cc8e67f1491bc67a58873c4357374cd"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.822533 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.823634 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" event={"ID":"74fb1224-a73c-47ca-ac3b-d23ed2116a84","Type":"ContainerStarted","Data":"351dfede43d74aa1904a4bd0da7f212697c0f3b165ac369f685b92463e4c6ff7"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.824149 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.826265 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d4vfd" event={"ID":"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534","Type":"ContainerStarted","Data":"4892b34e97fdf266b7562a0c6c90d0b468f8096d31e777bd79d4eefb24146189"} Mar 19 19:14:21 crc 
kubenswrapper[5033]: I0319 19:14:21.828653 5033 generic.go:334] "Generic (PLEG): container finished" podID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerID="eb6e100740a68a5d922a530bbca44098bf56919c104015cef0af0205cb13f2ac" exitCode=0 Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.828983 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" event={"ID":"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e","Type":"ContainerDied","Data":"eb6e100740a68a5d922a530bbca44098bf56919c104015cef0af0205cb13f2ac"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.834757 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ac9db1d-1045-42f9-a7af-1c118226d1d2","Type":"ContainerStarted","Data":"fd087e69925d33e860ca4413dd8a9677b443e362e422d397251ed5100937993d"} Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.852830 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" podStartSLOduration=11.957276103 podStartE2EDuration="23.852811872s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.719989907 +0000 UTC m=+1058.825019756" lastFinishedPulling="2026-03-19 19:14:20.615525676 +0000 UTC m=+1070.720555525" observedRunningTime="2026-03-19 19:14:21.847706959 +0000 UTC m=+1071.952736818" watchObservedRunningTime="2026-03-19 19:14:21.852811872 +0000 UTC m=+1071.957841721" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.858141 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.859679 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.875396 5033 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" podStartSLOduration=11.962238662 podStartE2EDuration="23.875378136s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.73324488 +0000 UTC m=+1058.838274729" lastFinishedPulling="2026-03-19 19:14:20.646384354 +0000 UTC m=+1070.751414203" observedRunningTime="2026-03-19 19:14:21.86768721 +0000 UTC m=+1071.972717059" watchObservedRunningTime="2026-03-19 19:14:21.875378136 +0000 UTC m=+1071.980407985" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.941243 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t" podStartSLOduration=12.027172508 podStartE2EDuration="23.941221017s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.737291344 +0000 UTC m=+1058.842321193" lastFinishedPulling="2026-03-19 19:14:20.651339853 +0000 UTC m=+1070.756369702" observedRunningTime="2026-03-19 19:14:21.940223249 +0000 UTC m=+1072.045253098" watchObservedRunningTime="2026-03-19 19:14:21.941221017 +0000 UTC m=+1072.046250876" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.968247 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=27.239759874 podStartE2EDuration="33.968233186s" podCreationTimestamp="2026-03-19 19:13:48 +0000 UTC" firstStartedPulling="2026-03-19 19:14:04.835502339 +0000 UTC m=+1054.940532198" lastFinishedPulling="2026-03-19 19:14:11.563975661 +0000 UTC m=+1061.669005510" observedRunningTime="2026-03-19 19:14:21.963663348 +0000 UTC m=+1072.068693197" watchObservedRunningTime="2026-03-19 19:14:21.968233186 +0000 UTC m=+1072.073263035" Mar 19 19:14:21 crc kubenswrapper[5033]: I0319 19:14:21.990768 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" podStartSLOduration=12.029626407 podStartE2EDuration="23.990750949s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.683754009 +0000 UTC m=+1058.788783858" lastFinishedPulling="2026-03-19 19:14:20.644878551 +0000 UTC m=+1070.749908400" observedRunningTime="2026-03-19 19:14:21.981959522 +0000 UTC m=+1072.086989371" watchObservedRunningTime="2026-03-19 19:14:21.990750949 +0000 UTC m=+1072.095780798" Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.009647 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" podStartSLOduration=12.198972246 podStartE2EDuration="24.00962847s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.737563241 +0000 UTC m=+1058.842593090" lastFinishedPulling="2026-03-19 19:14:20.548219455 +0000 UTC m=+1070.653249314" observedRunningTime="2026-03-19 19:14:22.003650582 +0000 UTC m=+1072.108680431" watchObservedRunningTime="2026-03-19 19:14:22.00962847 +0000 UTC m=+1072.114658309" Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.846536 5033 generic.go:334] "Generic (PLEG): container finished" podID="7b48e9b9-7f87-4d91-ad8c-eb50df3b6534" containerID="4892b34e97fdf266b7562a0c6c90d0b468f8096d31e777bd79d4eefb24146189" exitCode=0 Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.846713 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d4vfd" event={"ID":"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534","Type":"ContainerDied","Data":"4892b34e97fdf266b7562a0c6c90d0b468f8096d31e777bd79d4eefb24146189"} Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.849055 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" 
event={"ID":"fc506eb7-d9c5-4db4-9707-53ff8923ef3b","Type":"ContainerStarted","Data":"736651d6aa350f7ef87502aec1d461896311961351b12538b6e154f736066727"} Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.849656 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.856394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"035d8393-f8cc-4c44-b116-245b5e93e70c","Type":"ContainerStarted","Data":"c71aaa92fa568d8ee13b85ba3ffecdc03be4c33df24d17a90286acc42e5473b0"} Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.858965 5033 generic.go:334] "Generic (PLEG): container finished" podID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerID="2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494" exitCode=0 Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.859034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" event={"ID":"a6862789-4dd7-4159-b6ef-9b0f6e605725","Type":"ContainerDied","Data":"2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494"} Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.866685 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts" event={"ID":"ebcc8953-fc35-48d7-a3fd-be1a2291c08c","Type":"ContainerStarted","Data":"188d2ee81a4b00f254e7d960ddbfb86fdfa13757454f27e014a16652f6de8f32"} Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.866740 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8z6ts" Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.891976 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=12.940198399 podStartE2EDuration="24.891951298s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" 
firstStartedPulling="2026-03-19 19:14:08.720104761 +0000 UTC m=+1058.825134610" lastFinishedPulling="2026-03-19 19:14:20.67185766 +0000 UTC m=+1070.776887509" observedRunningTime="2026-03-19 19:14:22.883757657 +0000 UTC m=+1072.988787506" watchObservedRunningTime="2026-03-19 19:14:22.891951298 +0000 UTC m=+1072.996981147" Mar 19 19:14:22 crc kubenswrapper[5033]: I0319 19:14:22.924485 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8z6ts" podStartSLOduration=15.978991489 podStartE2EDuration="27.924470162s" podCreationTimestamp="2026-03-19 19:13:55 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.727244461 +0000 UTC m=+1058.832274310" lastFinishedPulling="2026-03-19 19:14:20.672723134 +0000 UTC m=+1070.777752983" observedRunningTime="2026-03-19 19:14:22.922886247 +0000 UTC m=+1073.027916106" watchObservedRunningTime="2026-03-19 19:14:22.924470162 +0000 UTC m=+1073.029500001" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.875264 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" event={"ID":"a6862789-4dd7-4159-b6ef-9b0f6e605725","Type":"ContainerStarted","Data":"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.876605 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.879950 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d4vfd" event={"ID":"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534","Type":"ContainerStarted","Data":"c2ce739d892132d2322b746c87b87f3b3bcaa8237f47e622bb1f3913fed02579"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.879991 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d4vfd" 
event={"ID":"7b48e9b9-7f87-4d91-ad8c-eb50df3b6534","Type":"ContainerStarted","Data":"007d59ba0f1dc81ffd6e553c3bf88208ad10d6d66b5ce41d3a0eaa5c54bae912"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.880022 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.880053 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.882014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" event={"ID":"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e","Type":"ContainerStarted","Data":"e140b3ba9be2c099b76c8d452ceea9c1abfbe050c2c9f0d6d56de534990855e9"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.882507 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.886030 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerStarted","Data":"7cd9d9fb85ad22b69c81922887eac2450b565e11286fc586b647ddcee1a10ecd"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.888974 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99","Type":"ContainerStarted","Data":"c3e23fb4b4374d60093bd9665ee4d55bb270ec2308caa61b64802221c713d533"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.890426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"7574570b-6325-4897-a35c-5712967a74f3","Type":"ContainerStarted","Data":"5ac0a2ec65500b55606dfb95ebbc650e38a37e8e1287de754191839e1698296d"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 
19:14:23.890778 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.892282 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"314e5a54-3e9c-42ce-807e-f798a2ab66f9","Type":"ContainerStarted","Data":"5a0992cead3024cbb0a1bc049985f6cd3e2c3b18fc5cdd4bca98b88a25bec621"} Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.905429 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" podStartSLOduration=4.735217309 podStartE2EDuration="39.905404142s" podCreationTimestamp="2026-03-19 19:13:44 +0000 UTC" firstStartedPulling="2026-03-19 19:13:45.502006136 +0000 UTC m=+1035.607035975" lastFinishedPulling="2026-03-19 19:14:20.672192959 +0000 UTC m=+1070.777222808" observedRunningTime="2026-03-19 19:14:23.898604591 +0000 UTC m=+1074.003634430" watchObservedRunningTime="2026-03-19 19:14:23.905404142 +0000 UTC m=+1074.010433991" Mar 19 19:14:23 crc kubenswrapper[5033]: I0319 19:14:23.957738 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-d4vfd" podStartSLOduration=17.169734856 podStartE2EDuration="28.957718723s" podCreationTimestamp="2026-03-19 19:13:55 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.737000605 +0000 UTC m=+1058.842030454" lastFinishedPulling="2026-03-19 19:14:20.524984452 +0000 UTC m=+1070.630014321" observedRunningTime="2026-03-19 19:14:23.949817571 +0000 UTC m=+1074.054847420" watchObservedRunningTime="2026-03-19 19:14:23.957718723 +0000 UTC m=+1074.062748572" Mar 19 19:14:24 crc kubenswrapper[5033]: I0319 19:14:23.999886 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" podStartSLOduration=5.145333605 podStartE2EDuration="39.999868447s" podCreationTimestamp="2026-03-19 19:13:44 +0000 UTC" 
firstStartedPulling="2026-03-19 19:13:45.791124831 +0000 UTC m=+1035.896154680" lastFinishedPulling="2026-03-19 19:14:20.645659683 +0000 UTC m=+1070.750689522" observedRunningTime="2026-03-19 19:14:23.991722588 +0000 UTC m=+1074.096752447" watchObservedRunningTime="2026-03-19 19:14:23.999868447 +0000 UTC m=+1074.104898296" Mar 19 19:14:25 crc kubenswrapper[5033]: I0319 19:14:25.906150 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ac9db1d-1045-42f9-a7af-1c118226d1d2" containerID="fd087e69925d33e860ca4413dd8a9677b443e362e422d397251ed5100937993d" exitCode=0 Mar 19 19:14:25 crc kubenswrapper[5033]: I0319 19:14:25.906229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ac9db1d-1045-42f9-a7af-1c118226d1d2","Type":"ContainerDied","Data":"fd087e69925d33e860ca4413dd8a9677b443e362e422d397251ed5100937993d"} Mar 19 19:14:25 crc kubenswrapper[5033]: I0319 19:14:25.908192 5033 generic.go:334] "Generic (PLEG): container finished" podID="85185bfa-1205-4129-8f90-55b580fd3939" containerID="b63eb97db3cb695dbaeeade51398e131936d3383bee25871cd01307ba8039f95" exitCode=0 Mar 19 19:14:25 crc kubenswrapper[5033]: I0319 19:14:25.908233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85185bfa-1205-4129-8f90-55b580fd3939","Type":"ContainerDied","Data":"b63eb97db3cb695dbaeeade51398e131936d3383bee25871cd01307ba8039f95"} Mar 19 19:14:25 crc kubenswrapper[5033]: I0319 19:14:25.936511 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=-9223372008.91828 podStartE2EDuration="27.936495398s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.768926263 +0000 UTC m=+1058.873956112" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:24.017803461 +0000 UTC m=+1074.122833320" 
watchObservedRunningTime="2026-03-19 19:14:25.936495398 +0000 UTC m=+1076.041525247" Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.932685 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"035d8393-f8cc-4c44-b116-245b5e93e70c","Type":"ContainerStarted","Data":"cc0233138f4a9b30d79da7730ad14ad9f37734247603819424fe8836b7625ed9"} Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.941191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"31733aba-46c2-4129-9088-e294daafa285","Type":"ContainerStarted","Data":"8d3304aca070b758850680d5fa2467f83beafe8797eaa618f081ccb11f236ab1"} Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.941407 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.944913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ac9db1d-1045-42f9-a7af-1c118226d1d2","Type":"ContainerStarted","Data":"482268a685c87301f45ee1227a8f9a2a37d0c9e98f20c758fe448b33889ec3fd"} Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.966962 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.242216186 podStartE2EDuration="34.966943357s" podCreationTimestamp="2026-03-19 19:13:53 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.737574232 +0000 UTC m=+1058.842604081" lastFinishedPulling="2026-03-19 19:14:27.462301413 +0000 UTC m=+1077.567331252" observedRunningTime="2026-03-19 19:14:27.961676039 +0000 UTC m=+1078.066705918" watchObservedRunningTime="2026-03-19 19:14:27.966943357 +0000 UTC m=+1078.071973216" Mar 19 19:14:27 crc kubenswrapper[5033]: I0319 19:14:27.984176 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" 
podStartSLOduration=30.203741737 podStartE2EDuration="41.984160661s" podCreationTimestamp="2026-03-19 19:13:46 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.733675412 +0000 UTC m=+1058.838705251" lastFinishedPulling="2026-03-19 19:14:20.514094326 +0000 UTC m=+1070.619124175" observedRunningTime="2026-03-19 19:14:27.983241175 +0000 UTC m=+1078.088271044" watchObservedRunningTime="2026-03-19 19:14:27.984160661 +0000 UTC m=+1078.089190500" Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.008293 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=-9223372006.846512 podStartE2EDuration="30.008264128s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.768884102 +0000 UTC m=+1058.873913951" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:28.004802981 +0000 UTC m=+1078.109832830" watchObservedRunningTime="2026-03-19 19:14:28.008264128 +0000 UTC m=+1078.113293977" Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.694632 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.755096 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.958313 5033 generic.go:334] "Generic (PLEG): container finished" podID="1b5fab5b-14ba-4b0a-adb3-f4bad7edac99" containerID="c3e23fb4b4374d60093bd9665ee4d55bb270ec2308caa61b64802221c713d533" exitCode=0 Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.958404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99","Type":"ContainerDied","Data":"c3e23fb4b4374d60093bd9665ee4d55bb270ec2308caa61b64802221c713d533"} Mar 19 19:14:28 crc 
kubenswrapper[5033]: I0319 19:14:28.966101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"314e5a54-3e9c-42ce-807e-f798a2ab66f9","Type":"ContainerStarted","Data":"deaaf5882abe56670b7f758a4c1d494b045c1f59874723a1c57c8631c93c3017"} Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.968506 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85185bfa-1205-4129-8f90-55b580fd3939","Type":"ContainerStarted","Data":"64af613e6c082174ba46e99737fa204967ddbf3d708862e99701107d54713a7a"} Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.971851 5033 generic.go:334] "Generic (PLEG): container finished" podID="ffd2aa46-7091-4165-9da4-248c04907907" containerID="7cd9d9fb85ad22b69c81922887eac2450b565e11286fc586b647ddcee1a10ecd" exitCode=0 Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.972422 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerDied","Data":"7cd9d9fb85ad22b69c81922887eac2450b565e11286fc586b647ddcee1a10ecd"} Mar 19 19:14:28 crc kubenswrapper[5033]: I0319 19:14:28.975876 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.015563 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.832041154 podStartE2EDuration="42.015544369s" podCreationTimestamp="2026-03-19 19:13:47 +0000 UTC" firstStartedPulling="2026-03-19 19:14:04.843969377 +0000 UTC m=+1054.948999226" lastFinishedPulling="2026-03-19 19:14:19.027472572 +0000 UTC m=+1069.132502441" observedRunningTime="2026-03-19 19:14:29.011393183 +0000 UTC m=+1079.116423042" watchObservedRunningTime="2026-03-19 19:14:29.015544369 +0000 UTC m=+1079.120574228" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 
19:14:29.054629 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.081661 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.386449766 podStartE2EDuration="31.081621657s" podCreationTimestamp="2026-03-19 19:13:58 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.768938563 +0000 UTC m=+1058.873968432" lastFinishedPulling="2026-03-19 19:14:27.464110464 +0000 UTC m=+1077.569140323" observedRunningTime="2026-03-19 19:14:29.07035893 +0000 UTC m=+1079.175388789" watchObservedRunningTime="2026-03-19 19:14:29.081621657 +0000 UTC m=+1079.186651516" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.144344 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.144403 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.186679 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.516716 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.517030 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="dnsmasq-dns" containerID="cri-o://e140b3ba9be2c099b76c8d452ceea9c1abfbe050c2c9f0d6d56de534990855e9" gracePeriod=10 Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.522635 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:14:29 crc 
kubenswrapper[5033]: I0319 19:14:29.563372 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.571651 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.574767 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.653496 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.655604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rxvn\" (UniqueName: \"kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.655731 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.656735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc\") pod 
\"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.660991 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.692392 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.693613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.726280 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-f4hlq"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.728805 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.734637 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.746254 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-f4hlq"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.746644 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758205 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovs-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758227 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758255 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013523ec-f077-4920-9b16-018f37cf5ef6-config\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758314 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovn-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758349 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6rxvn\" (UniqueName: \"kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758371 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5fl\" (UniqueName: \"kubernetes.io/projected/013523ec-f077-4920-9b16-018f37cf5ef6-kube-api-access-sk5fl\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758396 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-combined-ca-bundle\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.758414 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.759549 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.760336 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.771813 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.817101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rxvn\" (UniqueName: \"kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn\") pod \"dnsmasq-dns-7fd796d7df-wfxnw\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5fl\" (UniqueName: \"kubernetes.io/projected/013523ec-f077-4920-9b16-018f37cf5ef6-kube-api-access-sk5fl\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-combined-ca-bundle\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862749 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovs-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013523ec-f077-4920-9b16-018f37cf5ef6-config\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862843 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.862892 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovn-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.863409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovn-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.865110 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/013523ec-f077-4920-9b16-018f37cf5ef6-ovs-rundir\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.865189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013523ec-f077-4920-9b16-018f37cf5ef6-config\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.878060 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.901952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013523ec-f077-4920-9b16-018f37cf5ef6-combined-ca-bundle\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.905553 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.905865 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="dnsmasq-dns" containerID="cri-o://7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7" gracePeriod=10 Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.918944 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.940117 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.942169 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.944700 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5fl\" (UniqueName: \"kubernetes.io/projected/013523ec-f077-4920-9b16-018f37cf5ef6-kube-api-access-sk5fl\") pod \"ovn-controller-metrics-f4hlq\" (UID: \"013523ec-f077-4920-9b16-018f37cf5ef6\") " pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.946589 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.959925 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"] Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.966663 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkb6\" (UniqueName: \"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.966751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc 
kubenswrapper[5033]: I0319 19:14:29.966818 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.966968 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.967043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.985268 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" event={"ID":"bb876edd-c30e-4253-ac09-9db2e08dc2fc","Type":"ContainerStarted","Data":"eb98b9e36ce3a9f4c043b8fd90f8689773a1f355631d734c1ea657fd8ff89f9d"} Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.988308 5033 generic.go:334] "Generic (PLEG): container finished" podID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerID="e140b3ba9be2c099b76c8d452ceea9c1abfbe050c2c9f0d6d56de534990855e9" exitCode=0 Mar 19 19:14:29 crc kubenswrapper[5033]: I0319 19:14:29.989059 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" 
event={"ID":"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e","Type":"ContainerDied","Data":"e140b3ba9be2c099b76c8d452ceea9c1abfbe050c2c9f0d6d56de534990855e9"} Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.014850 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.018641 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" podStartSLOduration=10.040298225 podStartE2EDuration="30.018616971s" podCreationTimestamp="2026-03-19 19:14:00 +0000 UTC" firstStartedPulling="2026-03-19 19:14:08.740627987 +0000 UTC m=+1058.845657826" lastFinishedPulling="2026-03-19 19:14:28.718946723 +0000 UTC m=+1078.823976572" observedRunningTime="2026-03-19 19:14:30.013647632 +0000 UTC m=+1080.118677481" watchObservedRunningTime="2026-03-19 19:14:30.018616971 +0000 UTC m=+1080.123646810" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.025206 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.044223 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.057963 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-f4hlq" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.070074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkb6\" (UniqueName: \"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.070179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.070242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.070322 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.070368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" 
Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.071578 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.073020 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.073530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.073694 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.097045 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkb6\" (UniqueName: \"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6\") pod \"dnsmasq-dns-86db49b7ff-8s55v\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.241601 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.243574 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.250544 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.250615 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rwkj9" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.250830 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.250936 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.258000 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.265401 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.342286 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.383663 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config\") pod \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.383721 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc\") pod \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.383870 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmpxp\" (UniqueName: \"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp\") pod \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\" (UID: \"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-config\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384492 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-scripts\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-452f2\" (UniqueName: \"kubernetes.io/projected/920698c6-4c9d-4e12-bab3-9d7091f02548-kube-api-access-452f2\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384608 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.384636 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.402038 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp" (OuterVolumeSpecName: "kube-api-access-fmpxp") pod "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" (UID: "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e"). InnerVolumeSpecName "kube-api-access-fmpxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.438265 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config" (OuterVolumeSpecName: "config") pod "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" (UID: "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.454282 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" (UID: "d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486336 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486437 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-config\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486465 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-scripts\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486589 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-452f2\" 
(UniqueName: \"kubernetes.io/projected/920698c6-4c9d-4e12-bab3-9d7091f02548-kube-api-access-452f2\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486667 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486677 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.486688 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmpxp\" (UniqueName: \"kubernetes.io/projected/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e-kube-api-access-fmpxp\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.487944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.499258 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-scripts\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " 
pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.501871 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920698c6-4c9d-4e12-bab3-9d7091f02548-config\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.502559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.502591 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.504375 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/920698c6-4c9d-4e12-bab3-9d7091f02548-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.512248 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-452f2\" (UniqueName: \"kubernetes.io/projected/920698c6-4c9d-4e12-bab3-9d7091f02548-kube-api-access-452f2\") pod \"ovn-northd-0\" (UID: \"920698c6-4c9d-4e12-bab3-9d7091f02548\") " pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.536358 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.603900 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rwkj9" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.611677 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.691631 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwwxf\" (UniqueName: \"kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf\") pod \"a6862789-4dd7-4159-b6ef-9b0f6e605725\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.691818 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") pod \"a6862789-4dd7-4159-b6ef-9b0f6e605725\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.691859 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc\") pod \"a6862789-4dd7-4159-b6ef-9b0f6e605725\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.704719 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf" (OuterVolumeSpecName: "kube-api-access-pwwxf") pod "a6862789-4dd7-4159-b6ef-9b0f6e605725" (UID: "a6862789-4dd7-4159-b6ef-9b0f6e605725"). InnerVolumeSpecName "kube-api-access-pwwxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:30 crc kubenswrapper[5033]: E0319 19:14:30.743996 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config podName:a6862789-4dd7-4159-b6ef-9b0f6e605725 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:31.243967388 +0000 UTC m=+1081.348997237 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config") pod "a6862789-4dd7-4159-b6ef-9b0f6e605725" (UID: "a6862789-4dd7-4159-b6ef-9b0f6e605725") : error deleting /var/lib/kubelet/pods/a6862789-4dd7-4159-b6ef-9b0f6e605725/volume-subpaths: remove /var/lib/kubelet/pods/a6862789-4dd7-4159-b6ef-9b0f6e605725/volume-subpaths: no such file or directory Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.744435 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6862789-4dd7-4159-b6ef-9b0f6e605725" (UID: "a6862789-4dd7-4159-b6ef-9b0f6e605725"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.806626 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.806688 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwwxf\" (UniqueName: \"kubernetes.io/projected/a6862789-4dd7-4159-b6ef-9b0f6e605725-kube-api-access-pwwxf\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.944040 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-f4hlq"] Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.950701 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"] Mar 19 19:14:30 crc kubenswrapper[5033]: W0319 19:14:30.959992 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013523ec_f077_4920_9b16_018f37cf5ef6.slice/crio-41c38696ce23f8e11b6033b0cc6208664d20a95777d7b2c3e8a1fa4c77c0128b WatchSource:0}: Error finding container 41c38696ce23f8e11b6033b0cc6208664d20a95777d7b2c3e8a1fa4c77c0128b: Status 404 returned error can't find the container with id 41c38696ce23f8e11b6033b0cc6208664d20a95777d7b2c3e8a1fa4c77c0128b Mar 19 19:14:30 crc kubenswrapper[5033]: I0319 19:14:30.980839 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.002302 5033 generic.go:334] "Generic (PLEG): container finished" podID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerID="7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7" exitCode=0 Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.002605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" event={"ID":"a6862789-4dd7-4159-b6ef-9b0f6e605725","Type":"ContainerDied","Data":"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.002688 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.002792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-glns8" event={"ID":"a6862789-4dd7-4159-b6ef-9b0f6e605725","Type":"ContainerDied","Data":"9d751c8febe261b43696644633e45794d1c270ba1ddcebbc9e3264cbd5f43b76"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.002809 5033 scope.go:117] "RemoveContainer" containerID="7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.014635 5033 generic.go:334] "Generic (PLEG): container finished" podID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" containerID="eb98b9e36ce3a9f4c043b8fd90f8689773a1f355631d734c1ea657fd8ff89f9d" exitCode=0 Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.014675 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" event={"ID":"bb876edd-c30e-4253-ac09-9db2e08dc2fc","Type":"ContainerDied","Data":"eb98b9e36ce3a9f4c043b8fd90f8689773a1f355631d734c1ea657fd8ff89f9d"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.019061 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" event={"ID":"d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e","Type":"ContainerDied","Data":"baefc3cc432643abd9d796bccc3be0e91a212f2493d9f9d1f9bba77d063f36e2"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.019162 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q4fv7" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.021485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" event={"ID":"07639eb7-090f-4778-80fe-cc7b077d62b5","Type":"ContainerStarted","Data":"32af449d9dea9af5d6116bf5ea10e805a84b43dd97c3f60e63f6c35937d4912a"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.022690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" event={"ID":"f56c0f10-c865-425e-b89b-d5e885c0fdca","Type":"ContainerStarted","Data":"3e8f1f3562ea355159ac0071b4f741b5eb8c55d2b677adbce88a1d2cd1196411"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.028371 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-f4hlq" event={"ID":"013523ec-f077-4920-9b16-018f37cf5ef6","Type":"ContainerStarted","Data":"41c38696ce23f8e11b6033b0cc6208664d20a95777d7b2c3e8a1fa4c77c0128b"} Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.101766 5033 scope.go:117] "RemoveContainer" containerID="2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.125527 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.139747 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q4fv7"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.152069 5033 scope.go:117] "RemoveContainer" containerID="7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.153003 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:14:31 crc kubenswrapper[5033]: E0319 19:14:31.155375 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7\": container with ID starting with 7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7 not found: ID does not exist" containerID="7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.155427 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7"} err="failed to get container status \"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7\": rpc error: code = NotFound desc = could not find container \"7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7\": container with ID starting with 7f433148fa5eb9e92054b8f7f0c660f4f5f3371539e3526f1cf28da4bc8963f7 not found: ID does not exist" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.155469 5033 scope.go:117] "RemoveContainer" containerID="2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494" Mar 19 19:14:31 crc kubenswrapper[5033]: E0319 19:14:31.155727 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494\": container with ID starting with 2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494 not found: ID does not exist" containerID="2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.155751 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494"} err="failed to get container status \"2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494\": rpc error: code = NotFound desc = could not find container 
\"2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494\": container with ID starting with 2ad161029f3b96e9882b37c1d0c747dcbe49b8b476ee45971da7ee2c7dd9e494 not found: ID does not exist" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.155765 5033 scope.go:117] "RemoveContainer" containerID="e140b3ba9be2c099b76c8d452ceea9c1abfbe050c2c9f0d6d56de534990855e9" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.319753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") pod \"a6862789-4dd7-4159-b6ef-9b0f6e605725\" (UID: \"a6862789-4dd7-4159-b6ef-9b0f6e605725\") " Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.321293 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config" (OuterVolumeSpecName: "config") pod "a6862789-4dd7-4159-b6ef-9b0f6e605725" (UID: "a6862789-4dd7-4159-b6ef-9b0f6e605725"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.332850 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6862789-4dd7-4159-b6ef-9b0f6e605725-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.347895 5033 scope.go:117] "RemoveContainer" containerID="eb6e100740a68a5d922a530bbca44098bf56919c104015cef0af0205cb13f2ac" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.404007 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.436967 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"] Mar 19 19:14:31 crc kubenswrapper[5033]: E0319 19:14:31.437315 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.437326 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: E0319 19:14:31.437342 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.437348 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: E0319 19:14:31.437360 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="init" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.437365 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="init" Mar 19 19:14:31 crc 
kubenswrapper[5033]: E0319 19:14:31.437387 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="init" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.437393 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="init" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.438324 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.438342 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" containerName="dnsmasq-dns" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.441957 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.472812 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.534958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.535016 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.535059 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2b6f\" (UniqueName: \"kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.535106 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.535131 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.637039 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2b6f\" (UniqueName: \"kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.637102 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.637129 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.637197 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.637225 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.638018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.638773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.639263 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.648809 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.675323 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2b6f\" (UniqueName: \"kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f\") pod \"dnsmasq-dns-698758b865-8955p\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.810684 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.812321 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-glns8"] Mar 19 19:14:31 crc kubenswrapper[5033]: I0319 19:14:31.836966 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.071253 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"920698c6-4c9d-4e12-bab3-9d7091f02548","Type":"ContainerStarted","Data":"e71fbdf4993a585d4065b6d3ac97ca5222b4d23e5034797c898adea8589abc63"} Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.490867 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"] Mar 19 19:14:32 crc kubenswrapper[5033]: W0319 19:14:32.497734 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4998496_39f0_42d7_b0fa_e2caabe7ccad.slice/crio-fbac80e442d9fc7ab400c6a5b2a27b0b03a6e11c9fe0d4ed03e30ba6e8520db4 WatchSource:0}: Error finding container fbac80e442d9fc7ab400c6a5b2a27b0b03a6e11c9fe0d4ed03e30ba6e8520db4: Status 404 returned error can't find the container with id fbac80e442d9fc7ab400c6a5b2a27b0b03a6e11c9fe0d4ed03e30ba6e8520db4 Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.577243 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.651806 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6862789-4dd7-4159-b6ef-9b0f6e605725" path="/var/lib/kubelet/pods/a6862789-4dd7-4159-b6ef-9b0f6e605725/volumes" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.653299 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e" path="/var/lib/kubelet/pods/d502d3c0-8ce1-4e5a-8c2d-6744ca5b285e/volumes" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.687372 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:14:32 crc kubenswrapper[5033]: E0319 19:14:32.687905 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" containerName="oc" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.687924 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" containerName="oc" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.688158 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" containerName="oc" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.696991 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.701783 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5pdtg" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.702731 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.703087 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.703227 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.709084 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.777059 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zcsc\" (UniqueName: \"kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc\") pod \"bb876edd-c30e-4253-ac09-9db2e08dc2fc\" (UID: \"bb876edd-c30e-4253-ac09-9db2e08dc2fc\") " Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.809274 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc" (OuterVolumeSpecName: "kube-api-access-7zcsc") pod "bb876edd-c30e-4253-ac09-9db2e08dc2fc" (UID: "bb876edd-c30e-4253-ac09-9db2e08dc2fc"). InnerVolumeSpecName "kube-api-access-7zcsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnxd\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-kube-api-access-sdnxd\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880755 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880795 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-cache\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880862 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91fda80-4324-4015-a32f-3396d6d2da1d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880935 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" 
Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.880965 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-lock\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.881080 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zcsc\" (UniqueName: \"kubernetes.io/projected/bb876edd-c30e-4253-ac09-9db2e08dc2fc-kube-api-access-7zcsc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.983816 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnxd\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-kube-api-access-sdnxd\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.983987 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: E0319 19:14:32.984216 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:14:32 crc kubenswrapper[5033]: E0319 19:14:32.984238 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:14:32 crc kubenswrapper[5033]: E0319 19:14:32.984326 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift 
podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:14:33.484296395 +0000 UTC m=+1083.589326264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.984715 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-cache\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.984726 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-cache\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.984838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91fda80-4324-4015-a32f-3396d6d2da1d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.985027 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.985063 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-lock\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.985606 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a91fda80-4324-4015-a32f-3396d6d2da1d-lock\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.989968 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:14:32 crc kubenswrapper[5033]: I0319 19:14:32.990023 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94a49eb4db3a66914bf523510f5071046d35e5cb18f6c39e92bc52bad19205f9/globalmount\"" pod="openstack/swift-storage-0" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.002520 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91fda80-4324-4015-a32f-3396d6d2da1d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.010298 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnxd\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-kube-api-access-sdnxd\") pod 
\"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.066017 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ff243d0-2974-40b8-8e76-0d63750ed6c2\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.097748 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-xl8gh"] Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.112595 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-xl8gh"] Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.139215 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-f4hlq" event={"ID":"013523ec-f077-4920-9b16-018f37cf5ef6","Type":"ContainerStarted","Data":"7d95264f75e524ac8978d953255b19477417efc510c08a1116979e15345399b3"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.144390 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2baa96fb-8508-4335-b43e-4ec2da1af123","Type":"ContainerStarted","Data":"8ccb34d2c903bbca2ba09f3303239e9f3a44fc474f9a19c5f0b8409a8ea5ffb4"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.144637 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.154494 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bz7gc"] Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.156935 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.162736 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.162809 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.168470 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.171413 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bz7gc"] Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.181024 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-f4hlq" podStartSLOduration=4.180992603 podStartE2EDuration="4.180992603s" podCreationTimestamp="2026-03-19 19:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:33.167053342 +0000 UTC m=+1083.272083181" watchObservedRunningTime="2026-03-19 19:14:33.180992603 +0000 UTC m=+1083.286022452" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.192738 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.493380267 podStartE2EDuration="43.192719563s" podCreationTimestamp="2026-03-19 19:13:50 +0000 UTC" firstStartedPulling="2026-03-19 19:14:06.418067759 +0000 UTC m=+1056.523097608" lastFinishedPulling="2026-03-19 19:14:31.117407055 +0000 UTC m=+1081.222436904" observedRunningTime="2026-03-19 19:14:33.188413512 +0000 UTC m=+1083.293443361" watchObservedRunningTime="2026-03-19 19:14:33.192719563 +0000 UTC m=+1083.297749412" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.199491 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" event={"ID":"bb876edd-c30e-4253-ac09-9db2e08dc2fc","Type":"ContainerDied","Data":"044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.199541 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044b7337b7791f1698bdd35c212ba6511d85e17f42663a72c5455a1eb863da50" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.199539 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-qn7zn" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.232224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8955p" event={"ID":"e4998496-39f0-42d7-b0fa-e2caabe7ccad","Type":"ContainerStarted","Data":"fbac80e442d9fc7ab400c6a5b2a27b0b03a6e11c9fe0d4ed03e30ba6e8520db4"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.236529 5033 generic.go:334] "Generic (PLEG): container finished" podID="07639eb7-090f-4778-80fe-cc7b077d62b5" containerID="e273d6a459afda743031d9863b3813ce063f270478f23c3432e84fc6b51bb8fc" exitCode=0 Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.236593 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" event={"ID":"07639eb7-090f-4778-80fe-cc7b077d62b5","Type":"ContainerDied","Data":"e273d6a459afda743031d9863b3813ce063f270478f23c3432e84fc6b51bb8fc"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.239384 5033 generic.go:334] "Generic (PLEG): container finished" podID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerID="804e3258f0d1393788163824af8c99f2f5885b99ce19f68906b36322b81fe8ed" exitCode=0 Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.239429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" 
event={"ID":"f56c0f10-c865-425e-b89b-d5e885c0fdca","Type":"ContainerDied","Data":"804e3258f0d1393788163824af8c99f2f5885b99ce19f68906b36322b81fe8ed"} Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.306573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlqh\" (UniqueName: \"kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.306632 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.306688 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.306898 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.307039 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.307068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.307192 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409372 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409524 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts\") 
pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlqh\" (UniqueName: \"kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409603 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409628 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.409710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.412753 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " 
pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.412771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.413197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.418667 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.420931 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.423615 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.436062 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlqh\" (UniqueName: \"kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh\") pod \"swift-ring-rebalance-bz7gc\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.511889 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:33 crc kubenswrapper[5033]: E0319 19:14:33.512148 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:14:33 crc kubenswrapper[5033]: E0319 19:14:33.512201 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:14:33 crc kubenswrapper[5033]: E0319 19:14:33.512301 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:14:34.512264894 +0000 UTC m=+1084.617294743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found Mar 19 19:14:33 crc kubenswrapper[5033]: I0319 19:14:33.525084 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.040957 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.164152 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc\") pod \"07639eb7-090f-4778-80fe-cc7b077d62b5\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.164247 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb\") pod \"07639eb7-090f-4778-80fe-cc7b077d62b5\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.164309 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config\") pod \"07639eb7-090f-4778-80fe-cc7b077d62b5\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.164420 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rxvn\" (UniqueName: \"kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn\") pod \"07639eb7-090f-4778-80fe-cc7b077d62b5\" (UID: \"07639eb7-090f-4778-80fe-cc7b077d62b5\") " Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.170582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn" (OuterVolumeSpecName: "kube-api-access-6rxvn") pod "07639eb7-090f-4778-80fe-cc7b077d62b5" (UID: "07639eb7-090f-4778-80fe-cc7b077d62b5"). InnerVolumeSpecName "kube-api-access-6rxvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.189204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07639eb7-090f-4778-80fe-cc7b077d62b5" (UID: "07639eb7-090f-4778-80fe-cc7b077d62b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.193532 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config" (OuterVolumeSpecName: "config") pod "07639eb7-090f-4778-80fe-cc7b077d62b5" (UID: "07639eb7-090f-4778-80fe-cc7b077d62b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.218927 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07639eb7-090f-4778-80fe-cc7b077d62b5" (UID: "07639eb7-090f-4778-80fe-cc7b077d62b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.261268 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.262086 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wfxnw" event={"ID":"07639eb7-090f-4778-80fe-cc7b077d62b5","Type":"ContainerDied","Data":"32af449d9dea9af5d6116bf5ea10e805a84b43dd97c3f60e63f6c35937d4912a"} Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.262123 5033 scope.go:117] "RemoveContainer" containerID="e273d6a459afda743031d9863b3813ce063f270478f23c3432e84fc6b51bb8fc" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.266081 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.266105 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.266118 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07639eb7-090f-4778-80fe-cc7b077d62b5-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.266129 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rxvn\" (UniqueName: \"kubernetes.io/projected/07639eb7-090f-4778-80fe-cc7b077d62b5-kube-api-access-6rxvn\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.270638 5033 generic.go:334] "Generic (PLEG): container finished" podID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerID="8c988c140eb3a02b6ccf84af9eedffb61c75b3ffb51c419a64f861e34525755f" exitCode=0 Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.272471 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-8955p" event={"ID":"e4998496-39f0-42d7-b0fa-e2caabe7ccad","Type":"ContainerDied","Data":"8c988c140eb3a02b6ccf84af9eedffb61c75b3ffb51c419a64f861e34525755f"} Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.319201 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bz7gc"] Mar 19 19:14:34 crc kubenswrapper[5033]: W0319 19:14:34.323115 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d432b8_f84d_4565_96a7_7232024ffe4b.slice/crio-d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3 WatchSource:0}: Error finding container d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3: Status 404 returned error can't find the container with id d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3 Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.361369 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.394342 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wfxnw"] Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.571129 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:34 crc kubenswrapper[5033]: E0319 19:14:34.571340 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:14:34 crc kubenswrapper[5033]: E0319 19:14:34.571610 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:14:34 crc 
kubenswrapper[5033]: E0319 19:14:34.571659 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:14:36.57164571 +0000 UTC m=+1086.676675559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.632481 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07639eb7-090f-4778-80fe-cc7b077d62b5" path="/var/lib/kubelet/pods/07639eb7-090f-4778-80fe-cc7b077d62b5/volumes" Mar 19 19:14:34 crc kubenswrapper[5033]: I0319 19:14:34.632977 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bea9e5-2621-4992-81ad-63612a4d5460" path="/var/lib/kubelet/pods/e9bea9e5-2621-4992-81ad-63612a4d5460/volumes" Mar 19 19:14:35 crc kubenswrapper[5033]: I0319 19:14:35.298201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" event={"ID":"f56c0f10-c865-425e-b89b-d5e885c0fdca","Type":"ContainerStarted","Data":"6ff92545eafd2e4e600cd62655455ed51b6d22bb6c42368a7e1484fd827133c8"} Mar 19 19:14:35 crc kubenswrapper[5033]: I0319 19:14:35.299486 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:35 crc kubenswrapper[5033]: I0319 19:14:35.306888 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"920698c6-4c9d-4e12-bab3-9d7091f02548","Type":"ContainerStarted","Data":"393368da841e8782db137a2d3f2d5acd1ce5063e5ab0ad4a9593ef840ef8ec32"} Mar 19 19:14:35 crc kubenswrapper[5033]: I0319 19:14:35.308680 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-ring-rebalance-bz7gc" event={"ID":"66d432b8-f84d-4565-96a7-7232024ffe4b","Type":"ContainerStarted","Data":"d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3"} Mar 19 19:14:35 crc kubenswrapper[5033]: I0319 19:14:35.330478 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" podStartSLOduration=6.330429196 podStartE2EDuration="6.330429196s" podCreationTimestamp="2026-03-19 19:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:35.316831124 +0000 UTC m=+1085.421860973" watchObservedRunningTime="2026-03-19 19:14:35.330429196 +0000 UTC m=+1085.435459045" Mar 19 19:14:36 crc kubenswrapper[5033]: I0319 19:14:35.639787 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 19:14:36 crc kubenswrapper[5033]: I0319 19:14:35.741732 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 19:14:36 crc kubenswrapper[5033]: I0319 19:14:36.615282 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:36 crc kubenswrapper[5033]: E0319 19:14:36.615489 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:14:36 crc kubenswrapper[5033]: E0319 19:14:36.615707 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:14:36 crc kubenswrapper[5033]: E0319 19:14:36.615770 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:14:40.615753141 +0000 UTC m=+1090.720782990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.328674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"920698c6-4c9d-4e12-bab3-9d7091f02548","Type":"ContainerStarted","Data":"e3c0c7dea4804e8572cba110c1105fa1e25ffa7abbade4c7c62a68e663c8824f"} Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.328957 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.331080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99","Type":"ContainerStarted","Data":"45cfa5816b1df9a0699ab10a4c9243676b2494289c7d2cbbd929064480f4638f"} Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.333280 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8955p" event={"ID":"e4998496-39f0-42d7-b0fa-e2caabe7ccad","Type":"ContainerStarted","Data":"b8e80c1767e4282004a62d9483c41ed28cea7c1b161544b9ddd037cff26d538c"} Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.352764 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.605624166 podStartE2EDuration="7.352741606s" podCreationTimestamp="2026-03-19 19:14:30 +0000 UTC" firstStartedPulling="2026-03-19 19:14:31.136851221 +0000 UTC m=+1081.241881070" lastFinishedPulling="2026-03-19 
19:14:33.883968671 +0000 UTC m=+1083.988998510" observedRunningTime="2026-03-19 19:14:37.346590973 +0000 UTC m=+1087.451620822" watchObservedRunningTime="2026-03-19 19:14:37.352741606 +0000 UTC m=+1087.457771455" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.366372 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-8955p" podStartSLOduration=6.366337188 podStartE2EDuration="6.366337188s" podCreationTimestamp="2026-03-19 19:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:37.364775224 +0000 UTC m=+1087.469805073" watchObservedRunningTime="2026-03-19 19:14:37.366337188 +0000 UTC m=+1087.471367037" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.593178 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kfsmn"] Mar 19 19:14:37 crc kubenswrapper[5033]: E0319 19:14:37.593565 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07639eb7-090f-4778-80fe-cc7b077d62b5" containerName="init" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.593576 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="07639eb7-090f-4778-80fe-cc7b077d62b5" containerName="init" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.593776 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="07639eb7-090f-4778-80fe-cc7b077d62b5" containerName="init" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.594412 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.596555 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.602300 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kfsmn"] Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.633776 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.633956 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6fx\" (UniqueName: \"kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.651142 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.651199 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.735517 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.735967 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.736655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6fx\" (UniqueName: \"kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.737012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.759472 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6fx\" (UniqueName: \"kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx\") pod \"root-account-create-update-kfsmn\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") " pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:37 crc kubenswrapper[5033]: I0319 19:14:37.970990 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kfsmn" Mar 19 19:14:38 crc kubenswrapper[5033]: I0319 19:14:38.341361 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:38 crc kubenswrapper[5033]: I0319 19:14:38.414227 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 19:14:38 crc kubenswrapper[5033]: I0319 19:14:38.601122 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.009920 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-t65p9" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.091701 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-2pgf8" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.407226 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kd7bj"] Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.418840 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.423360 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kd7bj"] Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.473960 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.474001 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ljx7\" (UniqueName: \"kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.528333 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6546-account-create-update-dg5vw"] Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.529579 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.531737 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.536101 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6546-account-create-update-dg5vw"] Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.575601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx7gw\" (UniqueName: \"kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw\") pod \"glance-6546-account-create-update-dg5vw\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.575744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.575766 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ljx7\" (UniqueName: \"kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.575783 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts\") pod \"glance-6546-account-create-update-dg5vw\" (UID: 
\"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.576659 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.606621 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ljx7\" (UniqueName: \"kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7\") pod \"glance-db-create-kd7bj\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") " pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.677356 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx7gw\" (UniqueName: \"kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw\") pod \"glance-6546-account-create-update-dg5vw\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.677562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts\") pod \"glance-6546-account-create-update-dg5vw\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.680535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts\") pod \"glance-6546-account-create-update-dg5vw\" 
(UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.698922 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx7gw\" (UniqueName: \"kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw\") pod \"glance-6546-account-create-update-dg5vw\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") " pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.746377 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kd7bj" Mar 19 19:14:39 crc kubenswrapper[5033]: I0319 19:14:39.850735 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6546-account-create-update-dg5vw" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.111778 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9cns8"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.113180 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.132748 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9cns8"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.163718 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.213851 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.235587 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a359-account-create-update-sdjdx"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.254519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a359-account-create-update-sdjdx"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.263752 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.267043 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.287032 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfbx\" (UniqueName: \"kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.287243 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.354208 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.395053 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.395116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: 
\"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.395729 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.396249 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mm8\" (UniqueName: \"kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.396697 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfbx\" (UniqueName: \"kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.415535 5033 generic.go:334] "Generic (PLEG): container finished" podID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerID="4a087ee37539a6611aa3f7f2252e9f01369edf2d536cb5388780ab9ac0c88afc" exitCode=0 Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.415634 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerDied","Data":"4a087ee37539a6611aa3f7f2252e9f01369edf2d536cb5388780ab9ac0c88afc"} Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.439738 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dmfbx\" (UniqueName: \"kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx\") pod \"keystone-db-create-9cns8\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") " pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.441261 5033 generic.go:334] "Generic (PLEG): container finished" podID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerID="a9c60b46b5a9fb57d068e917fa1f0d393fc4d26de4b8ab813a914a8467d5a93d" exitCode=0 Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.441301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerDied","Data":"a9c60b46b5a9fb57d068e917fa1f0d393fc4d26de4b8ab813a914a8467d5a93d"} Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.445531 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-f8tqn"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.447832 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.453103 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f8tqn"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.492053 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9cns8" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.498195 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.498306 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mm8\" (UniqueName: \"kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.500169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.520871 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-37d3-account-create-update-wsdp9"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.522154 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.523337 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mm8\" (UniqueName: \"kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8\") pod \"keystone-a359-account-create-update-sdjdx\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") " pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.525903 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.532327 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-37d3-account-create-update-wsdp9"] Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.597505 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a359-account-create-update-sdjdx" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.600573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85cd\" (UniqueName: \"kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.600623 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.702715 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgb5m\" (UniqueName: \"kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.702788 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:14:40 crc kubenswrapper[5033]: E0319 19:14:40.702924 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:14:40 crc kubenswrapper[5033]: E0319 19:14:40.702941 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:14:40 crc kubenswrapper[5033]: E0319 19:14:40.702987 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:14:48.702973548 +0000 UTC m=+1098.808003397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.702931 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85cd\" (UniqueName: \"kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.703134 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.703168 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.703791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.721820 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c85cd\" (UniqueName: \"kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd\") pod \"placement-db-create-f8tqn\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") " pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.758404 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.758476 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.804269 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-f8tqn" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.804821 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgb5m\" (UniqueName: \"kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.804918 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.806427 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.826180 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgb5m\" (UniqueName: \"kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m\") pod \"placement-37d3-account-create-update-wsdp9\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") " pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:40 crc kubenswrapper[5033]: I0319 19:14:40.878611 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-37d3-account-create-update-wsdp9" Mar 19 19:14:41 crc kubenswrapper[5033]: I0319 19:14:41.311464 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 19:14:41 crc kubenswrapper[5033]: I0319 19:14:41.840827 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:14:41 crc kubenswrapper[5033]: I0319 19:14:41.921155 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"] Mar 19 19:14:41 crc kubenswrapper[5033]: I0319 19:14:41.921511 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="dnsmasq-dns" containerID="cri-o://6ff92545eafd2e4e600cd62655455ed51b6d22bb6c42368a7e1484fd827133c8" gracePeriod=10 Mar 19 19:14:41 crc kubenswrapper[5033]: I0319 19:14:41.963632 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kfsmn"] Mar 19 19:14:42 crc kubenswrapper[5033]: W0319 19:14:42.030591 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32980c0c_66a5_43fb_ba40_b4aaa57bec40.slice/crio-96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27 WatchSource:0}: Error finding container 96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27: Status 404 returned error can't find the container with id 96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27 Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.465386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bz7gc" event={"ID":"66d432b8-f84d-4565-96a7-7232024ffe4b","Type":"ContainerStarted","Data":"98d740cfe6231cb81a7eec3c6e3fbfcac96d6066ff73c6b543c106bc3eb0822c"} Mar 19 
19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.472238 5033 generic.go:334] "Generic (PLEG): container finished" podID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerID="6ff92545eafd2e4e600cd62655455ed51b6d22bb6c42368a7e1484fd827133c8" exitCode=0 Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.472361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" event={"ID":"f56c0f10-c865-425e-b89b-d5e885c0fdca","Type":"ContainerDied","Data":"6ff92545eafd2e4e600cd62655455ed51b6d22bb6c42368a7e1484fd827133c8"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.483195 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kd7bj"] Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.484761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerStarted","Data":"1a36b31f845e1c0781461b231449786077a087754d63d07145620aa63ac84c39"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.485279 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.487403 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kfsmn" event={"ID":"32980c0c-66a5-43fb-ba40-b4aaa57bec40","Type":"ContainerStarted","Data":"675948172a31265004e1481053dad14f758e5626aad4d5037b068fef23e7507d"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.487427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kfsmn" event={"ID":"32980c0c-66a5-43fb-ba40-b4aaa57bec40","Type":"ContainerStarted","Data":"96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.489410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kd7bj" 
event={"ID":"9143c6bb-985c-4b14-abf1-1813e54136cc","Type":"ContainerStarted","Data":"4e6fd099b5bcc812520a2285634ba5a9cfda203f8a649eb65cc10ad748449fa4"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.491713 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerStarted","Data":"d37758e1bf5a656491966c2f217d34d31f9f7ab73b8d88a435ebe4b1c85983d3"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.506956 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerStarted","Data":"41fb3582d1232a277241178fe8b3c077cce7be9055ded5f3e80f154971416e95"} Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.507351 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.532731 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bz7gc" podStartSLOduration=2.060262983 podStartE2EDuration="9.532694485s" podCreationTimestamp="2026-03-19 19:14:33 +0000 UTC" firstStartedPulling="2026-03-19 19:14:34.327421145 +0000 UTC m=+1084.432450994" lastFinishedPulling="2026-03-19 19:14:41.799852647 +0000 UTC m=+1091.904882496" observedRunningTime="2026-03-19 19:14:42.496386454 +0000 UTC m=+1092.601416303" watchObservedRunningTime="2026-03-19 19:14:42.532694485 +0000 UTC m=+1092.637724334" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.574505 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.377728694 podStartE2EDuration="57.574478739s" podCreationTimestamp="2026-03-19 19:13:45 +0000 UTC" firstStartedPulling="2026-03-19 19:13:54.370835027 +0000 UTC m=+1044.475864876" lastFinishedPulling="2026-03-19 19:14:06.567585072 
+0000 UTC m=+1056.672614921" observedRunningTime="2026-03-19 19:14:42.541849372 +0000 UTC m=+1092.646879231" watchObservedRunningTime="2026-03-19 19:14:42.574478739 +0000 UTC m=+1092.679508608" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.597098 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-kfsmn" podStartSLOduration=5.597069634 podStartE2EDuration="5.597069634s" podCreationTimestamp="2026-03-19 19:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:42.562970606 +0000 UTC m=+1092.668000455" watchObservedRunningTime="2026-03-19 19:14:42.597069634 +0000 UTC m=+1092.702099483" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.610094 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.137607287 podStartE2EDuration="58.605559233s" podCreationTimestamp="2026-03-19 19:13:44 +0000 UTC" firstStartedPulling="2026-03-19 19:13:47.045663709 +0000 UTC m=+1037.150693558" lastFinishedPulling="2026-03-19 19:14:06.513615655 +0000 UTC m=+1056.618645504" observedRunningTime="2026-03-19 19:14:42.592941628 +0000 UTC m=+1092.697971477" watchObservedRunningTime="2026-03-19 19:14:42.605559233 +0000 UTC m=+1092.710589082" Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.666691 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-37d3-account-create-update-wsdp9"] Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.715516 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6546-account-create-update-dg5vw"] Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.947252 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-f8tqn"] Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.953516 5033 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-a359-account-create-update-sdjdx"] Mar 19 19:14:42 crc kubenswrapper[5033]: I0319 19:14:42.966755 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9cns8"] Mar 19 19:14:43 crc kubenswrapper[5033]: W0319 19:14:43.087879 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9308b09f_03b2_4866_a458_26c8de752ae1.slice/crio-462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70 WatchSource:0}: Error finding container 462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70: Status 404 returned error can't find the container with id 462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70 Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.516424 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" event={"ID":"f56c0f10-c865-425e-b89b-d5e885c0fdca","Type":"ContainerDied","Data":"3e8f1f3562ea355159ac0071b4f741b5eb8c55d2b677adbce88a1d2cd1196411"} Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.516480 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e8f1f3562ea355159ac0071b4f741b5eb8c55d2b677adbce88a1d2cd1196411" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.518622 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a359-account-create-update-sdjdx" event={"ID":"5ffbb960-0f7e-4a03-9796-db1e3073d08e","Type":"ContainerStarted","Data":"6620863a46484573b062e41a33aa47cc7928d93e7df757aee63d3a2ead789baa"} Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.522907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37d3-account-create-update-wsdp9" event={"ID":"755b22ef-8257-4051-b51f-88ad1249cf11","Type":"ContainerStarted","Data":"2b9df12bf4542ce381fe0c7fbc0a1e2ff167ef7f476973f4518b1face0a59830"} Mar 19 19:14:43 crc 
kubenswrapper[5033]: I0319 19:14:43.524752 5033 generic.go:334] "Generic (PLEG): container finished" podID="32980c0c-66a5-43fb-ba40-b4aaa57bec40" containerID="675948172a31265004e1481053dad14f758e5626aad4d5037b068fef23e7507d" exitCode=0 Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.524811 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kfsmn" event={"ID":"32980c0c-66a5-43fb-ba40-b4aaa57bec40","Type":"ContainerDied","Data":"675948172a31265004e1481053dad14f758e5626aad4d5037b068fef23e7507d"} Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.526250 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f8tqn" event={"ID":"9308b09f-03b2-4866-a458-26c8de752ae1","Type":"ContainerStarted","Data":"462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70"} Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.527626 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6546-account-create-update-dg5vw" event={"ID":"630d3b9e-00d5-4627-901b-958ff5a2aca6","Type":"ContainerStarted","Data":"b6fac8310b91ef7762c509ece733a0f68b5a57adc2d6b12ddf42f00348189686"} Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.539479 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.667581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkb6\" (UniqueName: \"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6\") pod \"f56c0f10-c865-425e-b89b-d5e885c0fdca\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.668163 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config\") pod \"f56c0f10-c865-425e-b89b-d5e885c0fdca\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.668207 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb\") pod \"f56c0f10-c865-425e-b89b-d5e885c0fdca\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.668237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb\") pod \"f56c0f10-c865-425e-b89b-d5e885c0fdca\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.668277 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc\") pod \"f56c0f10-c865-425e-b89b-d5e885c0fdca\" (UID: \"f56c0f10-c865-425e-b89b-d5e885c0fdca\") " Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.682475 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6" (OuterVolumeSpecName: "kube-api-access-9nkb6") pod "f56c0f10-c865-425e-b89b-d5e885c0fdca" (UID: "f56c0f10-c865-425e-b89b-d5e885c0fdca"). InnerVolumeSpecName "kube-api-access-9nkb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.771538 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkb6\" (UniqueName: \"kubernetes.io/projected/f56c0f10-c865-425e-b89b-d5e885c0fdca-kube-api-access-9nkb6\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.821696 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config" (OuterVolumeSpecName: "config") pod "f56c0f10-c865-425e-b89b-d5e885c0fdca" (UID: "f56c0f10-c865-425e-b89b-d5e885c0fdca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.873243 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.918986 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f56c0f10-c865-425e-b89b-d5e885c0fdca" (UID: "f56c0f10-c865-425e-b89b-d5e885c0fdca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:43 crc kubenswrapper[5033]: I0319 19:14:43.974788 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.016381 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f56c0f10-c865-425e-b89b-d5e885c0fdca" (UID: "f56c0f10-c865-425e-b89b-d5e885c0fdca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.076250 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.546151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cns8" event={"ID":"c8b2953d-cccb-4334-9be9-a8a884cfb7a1","Type":"ContainerStarted","Data":"ee19acbfe70b1c87f13991876f338bf2bdd777fc62deead4eb9c41cda9864526"} Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.546562 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cns8" event={"ID":"c8b2953d-cccb-4334-9be9-a8a884cfb7a1","Type":"ContainerStarted","Data":"c830698f96f8fa3ec3d76b780dc4c744dd68c2f7584de87e9658072950b9163e"} Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.548142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f8tqn" event={"ID":"9308b09f-03b2-4866-a458-26c8de752ae1","Type":"ContainerStarted","Data":"ecc61e2309608f8eaa203a5546470634e17324237a7300753f20c38db7d2affe"} Mar 19 19:14:44 crc kubenswrapper[5033]: 
I0319 19:14:44.550832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kd7bj" event={"ID":"9143c6bb-985c-4b14-abf1-1813e54136cc","Type":"ContainerStarted","Data":"4ad83655ca16a46b659927a74732ee5d396c77654ee09126f44adaee4c46ae5f"}
Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.551014 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8s55v"
Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.567268 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-9cns8" podStartSLOduration=4.567251828 podStartE2EDuration="4.567251828s" podCreationTimestamp="2026-03-19 19:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:44.56446228 +0000 UTC m=+1094.669492129" watchObservedRunningTime="2026-03-19 19:14:44.567251828 +0000 UTC m=+1094.672281677"
Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.581048 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-kd7bj" podStartSLOduration=5.581025565 podStartE2EDuration="5.581025565s" podCreationTimestamp="2026-03-19 19:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:44.579613905 +0000 UTC m=+1094.684643764" watchObservedRunningTime="2026-03-19 19:14:44.581025565 +0000 UTC m=+1094.686055414"
Mar 19 19:14:44 crc kubenswrapper[5033]: I0319 19:14:44.607414 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-f8tqn" podStartSLOduration=4.607396026 podStartE2EDuration="4.607396026s" podCreationTimestamp="2026-03-19 19:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:14:44.603311122 +0000 UTC m=+1094.708340971" watchObservedRunningTime="2026-03-19 19:14:44.607396026 +0000 UTC m=+1094.712425875"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.093594 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f56c0f10-c865-425e-b89b-d5e885c0fdca" (UID: "f56c0f10-c865-425e-b89b-d5e885c0fdca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.111322 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f56c0f10-c865-425e-b89b-d5e885c0fdca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:45 crc kubenswrapper[5033]: E0319 19:14:45.380096 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755b22ef_8257_4051_b51f_88ad1249cf11.slice/crio-conmon-b64c4ff3d3e6593afa889465ec3697ca7df32e286b935c4ff9b1fff18ce7d9f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630d3b9e_00d5_4627_901b_958ff5a2aca6.slice/crio-conmon-0ce7e5743025c4198fea15cc250e53094640f6228eb009af102dcc0084740c55.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755b22ef_8257_4051_b51f_88ad1249cf11.slice/crio-b64c4ff3d3e6593afa889465ec3697ca7df32e286b935c4ff9b1fff18ce7d9f6.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.563210 5033 generic.go:334] "Generic (PLEG): container finished" podID="9143c6bb-985c-4b14-abf1-1813e54136cc" containerID="4ad83655ca16a46b659927a74732ee5d396c77654ee09126f44adaee4c46ae5f" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.565173 5033 generic.go:334] "Generic (PLEG): container finished" podID="c8b2953d-cccb-4334-9be9-a8a884cfb7a1" containerID="ee19acbfe70b1c87f13991876f338bf2bdd777fc62deead4eb9c41cda9864526" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.566733 5033 generic.go:334] "Generic (PLEG): container finished" podID="9308b09f-03b2-4866-a458-26c8de752ae1" containerID="ecc61e2309608f8eaa203a5546470634e17324237a7300753f20c38db7d2affe" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.568429 5033 generic.go:334] "Generic (PLEG): container finished" podID="630d3b9e-00d5-4627-901b-958ff5a2aca6" containerID="0ce7e5743025c4198fea15cc250e53094640f6228eb009af102dcc0084740c55" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.576635 5033 generic.go:334] "Generic (PLEG): container finished" podID="5ffbb960-0f7e-4a03-9796-db1e3073d08e" containerID="b094d9d30f70d26d4b984c7ba5580d9211d15d98e225604356a9421169565fb7" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.578803 5033 generic.go:334] "Generic (PLEG): container finished" podID="755b22ef-8257-4051-b51f-88ad1249cf11" containerID="b64c4ff3d3e6593afa889465ec3697ca7df32e286b935c4ff9b1fff18ce7d9f6" exitCode=0
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.578883 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.578907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kfsmn" event={"ID":"32980c0c-66a5-43fb-ba40-b4aaa57bec40","Type":"ContainerDied","Data":"96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.578941 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e57b36b00359babdfce72fdf9b02f15cd6e469ff9faf6c9f5b2978ae4a3b27"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579003 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579030 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kd7bj" event={"ID":"9143c6bb-985c-4b14-abf1-1813e54136cc","Type":"ContainerDied","Data":"4ad83655ca16a46b659927a74732ee5d396c77654ee09126f44adaee4c46ae5f"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579057 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cns8" event={"ID":"c8b2953d-cccb-4334-9be9-a8a884cfb7a1","Type":"ContainerDied","Data":"ee19acbfe70b1c87f13991876f338bf2bdd777fc62deead4eb9c41cda9864526"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579082 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f8tqn" event={"ID":"9308b09f-03b2-4866-a458-26c8de752ae1","Type":"ContainerDied","Data":"ecc61e2309608f8eaa203a5546470634e17324237a7300753f20c38db7d2affe"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6546-account-create-update-dg5vw" event={"ID":"630d3b9e-00d5-4627-901b-958ff5a2aca6","Type":"ContainerDied","Data":"0ce7e5743025c4198fea15cc250e53094640f6228eb009af102dcc0084740c55"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579121 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"1b5fab5b-14ba-4b0a-adb3-f4bad7edac99","Type":"ContainerStarted","Data":"0a88cbdc09d0901ce65954df015c1c812097652c2f302733dbebc493e9e7fe2e"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a359-account-create-update-sdjdx" event={"ID":"5ffbb960-0f7e-4a03-9796-db1e3073d08e","Type":"ContainerDied","Data":"b094d9d30f70d26d4b984c7ba5580d9211d15d98e225604356a9421169565fb7"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.579165 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37d3-account-create-update-wsdp9" event={"ID":"755b22ef-8257-4051-b51f-88ad1249cf11","Type":"ContainerDied","Data":"b64c4ff3d3e6593afa889465ec3697ca7df32e286b935c4ff9b1fff18ce7d9f6"}
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.597155 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kfsmn"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.650445 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=24.547513764 podStartE2EDuration="54.650422902s" podCreationTimestamp="2026-03-19 19:13:51 +0000 UTC" firstStartedPulling="2026-03-19 19:14:06.377263693 +0000 UTC m=+1056.482293542" lastFinishedPulling="2026-03-19 19:14:36.480172831 +0000 UTC m=+1086.585202680" observedRunningTime="2026-03-19 19:14:45.646155692 +0000 UTC m=+1095.751185571" watchObservedRunningTime="2026-03-19 19:14:45.650422902 +0000 UTC m=+1095.755452771"
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.721424 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6fx\" (UniqueName: \"kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx\") pod \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") "
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.721616 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts\") pod \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\" (UID: \"32980c0c-66a5-43fb-ba40-b4aaa57bec40\") "
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.722756 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32980c0c-66a5-43fb-ba40-b4aaa57bec40" (UID: "32980c0c-66a5-43fb-ba40-b4aaa57bec40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.742833 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx" (OuterVolumeSpecName: "kube-api-access-7n6fx") pod "32980c0c-66a5-43fb-ba40-b4aaa57bec40" (UID: "32980c0c-66a5-43fb-ba40-b4aaa57bec40"). InnerVolumeSpecName "kube-api-access-7n6fx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.786955 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"]
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.797286 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8s55v"]
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.839636 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n6fx\" (UniqueName: \"kubernetes.io/projected/32980c0c-66a5-43fb-ba40-b4aaa57bec40-kube-api-access-7n6fx\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:45 crc kubenswrapper[5033]: I0319 19:14:45.839665 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32980c0c-66a5-43fb-ba40-b4aaa57bec40-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:46 crc kubenswrapper[5033]: I0319 19:14:46.591738 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kfsmn"
Mar 19 19:14:46 crc kubenswrapper[5033]: I0319 19:14:46.591842 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerStarted","Data":"0ad19061be5c068775ac45b1d7b3a54c96ea4c32f9213b208bdd5f0b117543d9"}
Mar 19 19:14:46 crc kubenswrapper[5033]: I0319 19:14:46.635200 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" path="/var/lib/kubelet/pods/f56c0f10-c865-425e-b89b-d5e885c0fdca/volumes"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.023390 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a359-account-create-update-sdjdx"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.066598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts\") pod \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.066747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2mm8\" (UniqueName: \"kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8\") pod \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\" (UID: \"5ffbb960-0f7e-4a03-9796-db1e3073d08e\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.068367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ffbb960-0f7e-4a03-9796-db1e3073d08e" (UID: "5ffbb960-0f7e-4a03-9796-db1e3073d08e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.073002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8" (OuterVolumeSpecName: "kube-api-access-r2mm8") pod "5ffbb960-0f7e-4a03-9796-db1e3073d08e" (UID: "5ffbb960-0f7e-4a03-9796-db1e3073d08e"). InnerVolumeSpecName "kube-api-access-r2mm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.168551 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffbb960-0f7e-4a03-9796-db1e3073d08e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.168929 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2mm8\" (UniqueName: \"kubernetes.io/projected/5ffbb960-0f7e-4a03-9796-db1e3073d08e-kube-api-access-r2mm8\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.275083 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9cns8"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.282023 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kd7bj"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.294758 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37d3-account-create-update-wsdp9"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.317089 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6546-account-create-update-dg5vw"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.335639 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f8tqn"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.371941 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85cd\" (UniqueName: \"kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd\") pod \"9308b09f-03b2-4866-a458-26c8de752ae1\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372086 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts\") pod \"630d3b9e-00d5-4627-901b-958ff5a2aca6\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372117 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts\") pod \"9308b09f-03b2-4866-a458-26c8de752ae1\" (UID: \"9308b09f-03b2-4866-a458-26c8de752ae1\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372155 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts\") pod \"755b22ef-8257-4051-b51f-88ad1249cf11\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372202 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgb5m\" (UniqueName: \"kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m\") pod \"755b22ef-8257-4051-b51f-88ad1249cf11\" (UID: \"755b22ef-8257-4051-b51f-88ad1249cf11\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372356 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts\") pod \"9143c6bb-985c-4b14-abf1-1813e54136cc\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372385 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ljx7\" (UniqueName: \"kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7\") pod \"9143c6bb-985c-4b14-abf1-1813e54136cc\" (UID: \"9143c6bb-985c-4b14-abf1-1813e54136cc\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts\") pod \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372474 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmfbx\" (UniqueName: \"kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx\") pod \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\" (UID: \"c8b2953d-cccb-4334-9be9-a8a884cfb7a1\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.372492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx7gw\" (UniqueName: \"kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw\") pod \"630d3b9e-00d5-4627-901b-958ff5a2aca6\" (UID: \"630d3b9e-00d5-4627-901b-958ff5a2aca6\") "
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.373055 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "755b22ef-8257-4051-b51f-88ad1249cf11" (UID: "755b22ef-8257-4051-b51f-88ad1249cf11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.373134 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9143c6bb-985c-4b14-abf1-1813e54136cc" (UID: "9143c6bb-985c-4b14-abf1-1813e54136cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.373946 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9308b09f-03b2-4866-a458-26c8de752ae1" (UID: "9308b09f-03b2-4866-a458-26c8de752ae1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.373969 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "630d3b9e-00d5-4627-901b-958ff5a2aca6" (UID: "630d3b9e-00d5-4627-901b-958ff5a2aca6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376579 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8b2953d-cccb-4334-9be9-a8a884cfb7a1" (UID: "c8b2953d-cccb-4334-9be9-a8a884cfb7a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376753 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630d3b9e-00d5-4627-901b-958ff5a2aca6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376769 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9308b09f-03b2-4866-a458-26c8de752ae1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376777 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755b22ef-8257-4051-b51f-88ad1249cf11-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376785 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9143c6bb-985c-4b14-abf1-1813e54136cc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.376794 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.377127 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m" (OuterVolumeSpecName: "kube-api-access-lgb5m") pod "755b22ef-8257-4051-b51f-88ad1249cf11" (UID: "755b22ef-8257-4051-b51f-88ad1249cf11"). InnerVolumeSpecName "kube-api-access-lgb5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.379254 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7" (OuterVolumeSpecName: "kube-api-access-4ljx7") pod "9143c6bb-985c-4b14-abf1-1813e54136cc" (UID: "9143c6bb-985c-4b14-abf1-1813e54136cc"). InnerVolumeSpecName "kube-api-access-4ljx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.379684 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx" (OuterVolumeSpecName: "kube-api-access-dmfbx") pod "c8b2953d-cccb-4334-9be9-a8a884cfb7a1" (UID: "c8b2953d-cccb-4334-9be9-a8a884cfb7a1"). InnerVolumeSpecName "kube-api-access-dmfbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.380643 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd" (OuterVolumeSpecName: "kube-api-access-c85cd") pod "9308b09f-03b2-4866-a458-26c8de752ae1" (UID: "9308b09f-03b2-4866-a458-26c8de752ae1"). InnerVolumeSpecName "kube-api-access-c85cd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.381199 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw" (OuterVolumeSpecName: "kube-api-access-bx7gw") pod "630d3b9e-00d5-4627-901b-958ff5a2aca6" (UID: "630d3b9e-00d5-4627-901b-958ff5a2aca6"). InnerVolumeSpecName "kube-api-access-bx7gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.479051 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ljx7\" (UniqueName: \"kubernetes.io/projected/9143c6bb-985c-4b14-abf1-1813e54136cc-kube-api-access-4ljx7\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.479101 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmfbx\" (UniqueName: \"kubernetes.io/projected/c8b2953d-cccb-4334-9be9-a8a884cfb7a1-kube-api-access-dmfbx\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.479118 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx7gw\" (UniqueName: \"kubernetes.io/projected/630d3b9e-00d5-4627-901b-958ff5a2aca6-kube-api-access-bx7gw\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.479132 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c85cd\" (UniqueName: \"kubernetes.io/projected/9308b09f-03b2-4866-a458-26c8de752ae1-kube-api-access-c85cd\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.479143 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgb5m\" (UniqueName: \"kubernetes.io/projected/755b22ef-8257-4051-b51f-88ad1249cf11-kube-api-access-lgb5m\") on node \"crc\" DevicePath \"\""
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.603748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kd7bj" event={"ID":"9143c6bb-985c-4b14-abf1-1813e54136cc","Type":"ContainerDied","Data":"4e6fd099b5bcc812520a2285634ba5a9cfda203f8a649eb65cc10ad748449fa4"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.603784 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6fd099b5bcc812520a2285634ba5a9cfda203f8a649eb65cc10ad748449fa4"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.603804 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kd7bj"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.610723 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9cns8" event={"ID":"c8b2953d-cccb-4334-9be9-a8a884cfb7a1","Type":"ContainerDied","Data":"c830698f96f8fa3ec3d76b780dc4c744dd68c2f7584de87e9658072950b9163e"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.610776 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c830698f96f8fa3ec3d76b780dc4c744dd68c2f7584de87e9658072950b9163e"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.610736 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9cns8"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.626086 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-f8tqn" event={"ID":"9308b09f-03b2-4866-a458-26c8de752ae1","Type":"ContainerDied","Data":"462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.626126 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462fc98bd13ca070f8c65f47a3647899348ccfdd358d7cd07debd563784afb70"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.626194 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-f8tqn"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.634855 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6546-account-create-update-dg5vw"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.634830 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6546-account-create-update-dg5vw" event={"ID":"630d3b9e-00d5-4627-901b-958ff5a2aca6","Type":"ContainerDied","Data":"b6fac8310b91ef7762c509ece733a0f68b5a57adc2d6b12ddf42f00348189686"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.635046 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6fac8310b91ef7762c509ece733a0f68b5a57adc2d6b12ddf42f00348189686"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.637972 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a359-account-create-update-sdjdx" event={"ID":"5ffbb960-0f7e-4a03-9796-db1e3073d08e","Type":"ContainerDied","Data":"6620863a46484573b062e41a33aa47cc7928d93e7df757aee63d3a2ead789baa"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.638014 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6620863a46484573b062e41a33aa47cc7928d93e7df757aee63d3a2ead789baa"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.637987 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a359-account-create-update-sdjdx"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.643107 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-37d3-account-create-update-wsdp9"
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.643105 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-37d3-account-create-update-wsdp9" event={"ID":"755b22ef-8257-4051-b51f-88ad1249cf11","Type":"ContainerDied","Data":"2b9df12bf4542ce381fe0c7fbc0a1e2ff167ef7f476973f4518b1face0a59830"}
Mar 19 19:14:47 crc kubenswrapper[5033]: I0319 19:14:47.643150 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b9df12bf4542ce381fe0c7fbc0a1e2ff167ef7f476973f4518b1face0a59830"
Mar 19 19:14:48 crc kubenswrapper[5033]: I0319 19:14:48.801518 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0"
Mar 19 19:14:48 crc kubenswrapper[5033]: E0319 19:14:48.802005 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 19:14:48 crc kubenswrapper[5033]: E0319 19:14:48.802072 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 19:14:48 crc kubenswrapper[5033]: E0319 19:14:48.802176 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift podName:a91fda80-4324-4015-a32f-3396d6d2da1d nodeName:}" failed. No retries permitted until 2026-03-19 19:15:04.802144805 +0000 UTC m=+1114.907174654 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift") pod "swift-storage-0" (UID: "a91fda80-4324-4015-a32f-3396d6d2da1d") : configmap "swift-ring-files" not found
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.661637 5033 generic.go:334] "Generic (PLEG): container finished" podID="66d432b8-f84d-4565-96a7-7232024ffe4b" containerID="98d740cfe6231cb81a7eec3c6e3fbfcac96d6066ff73c6b543c106bc3eb0822c" exitCode=0
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.661702 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bz7gc" event={"ID":"66d432b8-f84d-4565-96a7-7232024ffe4b","Type":"ContainerDied","Data":"98d740cfe6231cb81a7eec3c6e3fbfcac96d6066ff73c6b543c106bc3eb0822c"}
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.726079 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mj7zf"]
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727266 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffbb960-0f7e-4a03-9796-db1e3073d08e" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727288 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffbb960-0f7e-4a03-9796-db1e3073d08e" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727312 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9143c6bb-985c-4b14-abf1-1813e54136cc" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727320 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9143c6bb-985c-4b14-abf1-1813e54136cc" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727340 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755b22ef-8257-4051-b51f-88ad1249cf11" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727360 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="755b22ef-8257-4051-b51f-88ad1249cf11" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727378 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630d3b9e-00d5-4627-901b-958ff5a2aca6" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727385 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="630d3b9e-00d5-4627-901b-958ff5a2aca6" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727404 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="dnsmasq-dns"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727410 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="dnsmasq-dns"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727429 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9308b09f-03b2-4866-a458-26c8de752ae1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727439 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9308b09f-03b2-4866-a458-26c8de752ae1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727494 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b2953d-cccb-4334-9be9-a8a884cfb7a1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727501 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b2953d-cccb-4334-9be9-a8a884cfb7a1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727514 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32980c0c-66a5-43fb-ba40-b4aaa57bec40" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727529 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="32980c0c-66a5-43fb-ba40-b4aaa57bec40" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: E0319 19:14:49.727556 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="init"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.727562 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="init"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728123 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="630d3b9e-00d5-4627-901b-958ff5a2aca6" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728142 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9143c6bb-985c-4b14-abf1-1813e54136cc" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728162 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9308b09f-03b2-4866-a458-26c8de752ae1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728181 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b2953d-cccb-4334-9be9-a8a884cfb7a1" containerName="mariadb-database-create"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728192 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="755b22ef-8257-4051-b51f-88ad1249cf11" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728214 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffbb960-0f7e-4a03-9796-db1e3073d08e" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728231 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="32980c0c-66a5-43fb-ba40-b4aaa57bec40" containerName="mariadb-account-create-update"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.728242 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56c0f10-c865-425e-b89b-d5e885c0fdca" containerName="dnsmasq-dns"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.731887 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mj7zf"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.739197 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xnfth"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.739439 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.751540 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mj7zf"]
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.823909 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.824053 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf"
Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.824084 5033 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.824168 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.878874 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.926111 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.926184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.926345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.926423 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.932881 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.933605 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.933741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle\") pod \"glance-db-sync-mj7zf\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:49 crc kubenswrapper[5033]: I0319 19:14:49.957441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9\") pod \"glance-db-sync-mj7zf\" (UID: 
\"8112e00f-afca-47a3-b233-b8282cccf396\") " pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:50 crc kubenswrapper[5033]: I0319 19:14:50.053889 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mj7zf" Mar 19 19:14:50 crc kubenswrapper[5033]: I0319 19:14:50.683802 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerStarted","Data":"d043f8652fccf13947db9dfaf6ba603dd3045334dfdccda82a537495345d17f6"} Mar 19 19:14:50 crc kubenswrapper[5033]: I0319 19:14:50.689633 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 19:14:50 crc kubenswrapper[5033]: I0319 19:14:50.726317 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.230327369 podStartE2EDuration="59.726299676s" podCreationTimestamp="2026-03-19 19:13:51 +0000 UTC" firstStartedPulling="2026-03-19 19:14:06.416700961 +0000 UTC m=+1056.521730810" lastFinishedPulling="2026-03-19 19:14:49.912673278 +0000 UTC m=+1100.017703117" observedRunningTime="2026-03-19 19:14:50.719071333 +0000 UTC m=+1100.824101182" watchObservedRunningTime="2026-03-19 19:14:50.726299676 +0000 UTC m=+1100.831329525" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.242968 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mj7zf"] Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.266575 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kfsmn"] Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.273063 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kfsmn"] Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.278522 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462699 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462798 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462857 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462911 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.462986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jlqh\" 
(UniqueName: \"kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.463124 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf\") pod \"66d432b8-f84d-4565-96a7-7232024ffe4b\" (UID: \"66d432b8-f84d-4565-96a7-7232024ffe4b\") " Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.463517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.463603 5033 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.463864 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.470222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh" (OuterVolumeSpecName: "kube-api-access-7jlqh") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "kube-api-access-7jlqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.474277 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.490381 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts" (OuterVolumeSpecName: "scripts") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.491587 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.496237 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66d432b8-f84d-4565-96a7-7232024ffe4b" (UID: "66d432b8-f84d-4565-96a7-7232024ffe4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565463 5033 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565507 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d432b8-f84d-4565-96a7-7232024ffe4b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565518 5033 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565531 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d432b8-f84d-4565-96a7-7232024ffe4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565544 5033 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/66d432b8-f84d-4565-96a7-7232024ffe4b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.565556 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jlqh\" (UniqueName: 
\"kubernetes.io/projected/66d432b8-f84d-4565-96a7-7232024ffe4b-kube-api-access-7jlqh\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.693629 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bz7gc" event={"ID":"66d432b8-f84d-4565-96a7-7232024ffe4b","Type":"ContainerDied","Data":"d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3"} Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.693844 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d142903432ba53e299344502f06995f27e743c5a282177c62da4a29161db38d3" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.693682 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bz7gc" Mar 19 19:14:51 crc kubenswrapper[5033]: I0319 19:14:51.694789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mj7zf" event={"ID":"8112e00f-afca-47a3-b233-b8282cccf396","Type":"ContainerStarted","Data":"a63c135a757863b2f14a9f068393579deee95d9aaa899f3730840e42214875eb"} Mar 19 19:14:52 crc kubenswrapper[5033]: I0319 19:14:52.655684 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32980c0c-66a5-43fb-ba40-b4aaa57bec40" path="/var/lib/kubelet/pods/32980c0c-66a5-43fb-ba40-b4aaa57bec40/volumes" Mar 19 19:14:52 crc kubenswrapper[5033]: I0319 19:14:52.885332 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 19:14:52 crc kubenswrapper[5033]: I0319 19:14:52.885399 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 19:14:52 crc kubenswrapper[5033]: I0319 19:14:52.888785 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 19:14:53 crc kubenswrapper[5033]: I0319 19:14:53.711315 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.508186 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8z6ts" podUID="ebcc8953-fc35-48d7-a3fd-be1a2291c08c" containerName="ovn-controller" probeResult="failure" output=< Mar 19 19:14:55 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 19:14:55 crc kubenswrapper[5033]: > Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.525366 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.533247 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-d4vfd" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.757314 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8z6ts-config-8nqqm"] Mar 19 19:14:55 crc kubenswrapper[5033]: E0319 19:14:55.762091 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d432b8-f84d-4565-96a7-7232024ffe4b" containerName="swift-ring-rebalance" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.762124 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d432b8-f84d-4565-96a7-7232024ffe4b" containerName="swift-ring-rebalance" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.762364 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d432b8-f84d-4565-96a7-7232024ffe4b" containerName="swift-ring-rebalance" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.763180 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.765967 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts-config-8nqqm"] Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.767474 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843440 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843509 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843584 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843624 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") 
" pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843688 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbg49\" (UniqueName: \"kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.843734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.944830 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbg49\" (UniqueName: \"kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.944904 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.944941 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts\") pod 
\"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.944968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.945042 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.945084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.945397 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.945474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: 
\"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.945506 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.946749 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.947092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:55 crc kubenswrapper[5033]: I0319 19:14:55.966053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbg49\" (UniqueName: \"kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49\") pod \"ovn-controller-8z6ts-config-8nqqm\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.086565 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.239787 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.240123 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" containerID="cri-o://d37758e1bf5a656491966c2f217d34d31f9f7ab73b8d88a435ebe4b1c85983d3" gracePeriod=600 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.240567 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="thanos-sidecar" containerID="cri-o://d043f8652fccf13947db9dfaf6ba603dd3045334dfdccda82a537495345d17f6" gracePeriod=600 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.240648 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="config-reloader" containerID="cri-o://0ad19061be5c068775ac45b1d7b3a54c96ea4c32f9213b208bdd5f0b117543d9" gracePeriod=600 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.378835 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-v74jb"] Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.380969 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.384749 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.389768 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v74jb"] Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.452858 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l76db\" (UniqueName: \"kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.453014 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.475641 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.504611 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.554626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l76db\" (UniqueName: \"kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " 
pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.554934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.556771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.599993 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l76db\" (UniqueName: \"kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db\") pod \"root-account-create-update-v74jb\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750017 5033 generic.go:334] "Generic (PLEG): container finished" podID="ffd2aa46-7091-4165-9da4-248c04907907" containerID="d043f8652fccf13947db9dfaf6ba603dd3045334dfdccda82a537495345d17f6" exitCode=0 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750044 5033 generic.go:334] "Generic (PLEG): container finished" podID="ffd2aa46-7091-4165-9da4-248c04907907" containerID="0ad19061be5c068775ac45b1d7b3a54c96ea4c32f9213b208bdd5f0b117543d9" exitCode=0 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750086 5033 generic.go:334] "Generic (PLEG): container finished" podID="ffd2aa46-7091-4165-9da4-248c04907907" 
containerID="d37758e1bf5a656491966c2f217d34d31f9f7ab73b8d88a435ebe4b1c85983d3" exitCode=0 Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750105 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerDied","Data":"d043f8652fccf13947db9dfaf6ba603dd3045334dfdccda82a537495345d17f6"} Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerDied","Data":"0ad19061be5c068775ac45b1d7b3a54c96ea4c32f9213b208bdd5f0b117543d9"} Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.750138 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerDied","Data":"d37758e1bf5a656491966c2f217d34d31f9f7ab73b8d88a435ebe4b1c85983d3"} Mar 19 19:14:56 crc kubenswrapper[5033]: I0319 19:14:56.757957 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v74jb" Mar 19 19:14:57 crc kubenswrapper[5033]: I0319 19:14:57.879902 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": dial tcp 10.217.0.116:9090: connect: connection refused" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.173768 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6fqmp"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.174974 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.202550 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6fqmp"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.284090 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqqg9\" (UniqueName: \"kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.284270 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.292070 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6fd1-account-create-update-pc8fs"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.293184 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.295360 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.304662 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6fd1-account-create-update-pc8fs"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.388176 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-xnstv"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.389528 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.391398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.391465 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqqg9\" (UniqueName: \"kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.391557 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc 
kubenswrapper[5033]: I0319 19:14:58.391584 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ctw2\" (UniqueName: \"kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.398984 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.401331 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xnstv"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.431982 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqqg9\" (UniqueName: \"kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9\") pod \"cinder-db-create-6fqmp\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.474510 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qc584"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.475916 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.483496 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6e4d-account-create-update-bccmp"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.485237 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.489397 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.492720 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts\") pod \"cloudkitty-db-create-xnstv\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.492871 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.492998 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qptzc\" (UniqueName: \"kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc\") pod \"cloudkitty-db-create-xnstv\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.493099 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ctw2\" (UniqueName: \"kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.493974 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.501441 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6fqmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.503207 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qc584"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.533016 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6e4d-account-create-update-bccmp"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.564404 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ctw2\" (UniqueName: \"kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2\") pod \"cinder-6fd1-account-create-update-pc8fs\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.587340 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kpzv2"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.588584 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.592204 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.592410 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tm5hs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.594802 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.594851 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts\") pod \"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.594889 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkch\" (UniqueName: \"kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.594951 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qptzc\" (UniqueName: \"kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc\") pod \"cloudkitty-db-create-xnstv\" 
(UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6q5m\" (UniqueName: \"kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m\") pod \"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595160 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts\") pod \"cloudkitty-db-create-xnstv\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595554 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595737 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595844 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts\") pod \"cloudkitty-db-create-xnstv\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.595904 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kpzv2"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.611064 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.621064 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qptzc\" (UniqueName: \"kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc\") pod \"cloudkitty-db-create-xnstv\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696517 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts\") pod \"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696568 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkch\" (UniqueName: \"kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6q5m\" (UniqueName: \"kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m\") pod 
\"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696689 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696746 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.696763 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rggz2\" (UniqueName: \"kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.697484 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.697972 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts\") pod 
\"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.699479 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-292e-account-create-update-vzxnc"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.700553 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.702736 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.713554 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-j9qgq"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.714783 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.718370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6q5m\" (UniqueName: \"kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m\") pod \"barbican-db-create-qc584\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.719918 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkch\" (UniqueName: \"kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch\") pod \"barbican-6e4d-account-create-update-bccmp\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.738271 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-292e-account-create-update-vzxnc"] Mar 19 19:14:58 
crc kubenswrapper[5033]: I0319 19:14:58.755149 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j9qgq"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799201 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799343 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxz5\" (UniqueName: \"kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799383 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799690 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799722 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rggz2\" (UniqueName: \"kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.799830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvjp\" (UniqueName: \"kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.805629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.805652 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.811321 5033 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qc584" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.812689 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5391-account-create-update-wd7jh"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.813814 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.820607 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.820755 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.839304 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rggz2\" (UniqueName: \"kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2\") pod \"keystone-db-sync-kpzv2\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.867501 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5391-account-create-update-wd7jh"] Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901154 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxz5\" (UniqueName: \"kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901227 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901252 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901362 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvjp\" (UniqueName: \"kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.901378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdtw\" (UniqueName: \"kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 
19:14:58.902253 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.902717 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.912040 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.917884 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxz5\" (UniqueName: \"kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5\") pod \"neutron-db-create-j9qgq\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:58 crc kubenswrapper[5033]: I0319 19:14:58.917919 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvjp\" (UniqueName: \"kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp\") pod \"cloudkitty-292e-account-create-update-vzxnc\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.003358 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdtw\" (UniqueName: 
\"kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.003490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.004164 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.018624 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksdtw\" (UniqueName: \"kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw\") pod \"neutron-5391-account-create-update-wd7jh\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.085988 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.103017 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j9qgq" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.163272 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:14:59 crc kubenswrapper[5033]: I0319 19:14:59.881534 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.137441 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf"] Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.139229 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.141908 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.146365 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf"] Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.147014 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.225486 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.225604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k7nml\" (UniqueName: \"kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.225680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.326864 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.326935 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.327024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7nml\" (UniqueName: \"kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.327890 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.332170 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.353307 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7nml\" (UniqueName: \"kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml\") pod \"collect-profiles-29565795-78mtf\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.462932 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:00 crc kubenswrapper[5033]: I0319 19:15:00.548937 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8z6ts" podUID="ebcc8953-fc35-48d7-a3fd-be1a2291c08c" containerName="ovn-controller" probeResult="failure" output=< Mar 19 19:15:00 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 19:15:00 crc kubenswrapper[5033]: > Mar 19 19:15:02 crc kubenswrapper[5033]: I0319 19:15:02.879860 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": dial tcp 10.217.0.116:9090: connect: connection refused" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.674758 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.819750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.819905 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.819931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.819951 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.819972 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wl4\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc 
kubenswrapper[5033]: I0319 19:15:04.820023 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820051 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820131 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820224 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820292 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets\") pod \"ffd2aa46-7091-4165-9da4-248c04907907\" (UID: \"ffd2aa46-7091-4165-9da4-248c04907907\") " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820590 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.820785 5033 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.821334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.821738 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.826204 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a91fda80-4324-4015-a32f-3396d6d2da1d-etc-swift\") pod \"swift-storage-0\" (UID: \"a91fda80-4324-4015-a32f-3396d6d2da1d\") " pod="openstack/swift-storage-0" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.830777 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out" (OuterVolumeSpecName: "config-out") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.835049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.839639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config" (OuterVolumeSpecName: "config") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.841246 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.844873 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ffd2aa46-7091-4165-9da4-248c04907907","Type":"ContainerDied","Data":"a311d0dd03776dcff7bf83d71d6264d533e4ef00b6678e96e932188531ec4f5f"} Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.844920 5033 scope.go:117] "RemoveContainer" containerID="d043f8652fccf13947db9dfaf6ba603dd3045334dfdccda82a537495345d17f6" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.845078 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.847976 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4" (OuterVolumeSpecName: "kube-api-access-d8wl4") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "kube-api-access-d8wl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.852973 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config" (OuterVolumeSpecName: "web-config") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.868714 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.872231 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "ffd2aa46-7091-4165-9da4-248c04907907" (UID: "ffd2aa46-7091-4165-9da4-248c04907907"). InnerVolumeSpecName "pvc-e2532600-f42e-4d76-80ca-02b9e884ca72". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.917556 5033 scope.go:117] "RemoveContainer" containerID="0ad19061be5c068775ac45b1d7b3a54c96ea4c32f9213b208bdd5f0b117543d9" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921357 5033 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921384 5033 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921395 5033 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ffd2aa46-7091-4165-9da4-248c04907907-config-out\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921406 5033 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921414 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wl4\" (UniqueName: \"kubernetes.io/projected/ffd2aa46-7091-4165-9da4-248c04907907-kube-api-access-d8wl4\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921444 5033 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-web-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921514 5033 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ffd2aa46-7091-4165-9da4-248c04907907-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921523 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffd2aa46-7091-4165-9da4-248c04907907-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.921557 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") on node \"crc\" " Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.949653 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.949806 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e2532600-f42e-4d76-80ca-02b9e884ca72" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72") on node "crc" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.965783 5033 scope.go:117] "RemoveContainer" containerID="d37758e1bf5a656491966c2f217d34d31f9f7ab73b8d88a435ebe4b1c85983d3" Mar 19 19:15:04 crc kubenswrapper[5033]: I0319 19:15:04.997641 5033 scope.go:117] "RemoveContainer" containerID="7cd9d9fb85ad22b69c81922887eac2450b565e11286fc586b647ddcee1a10ecd" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.024159 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.041773 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6fd1-account-create-update-pc8fs"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.185498 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.195546 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216131 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:15:05 crc kubenswrapper[5033]: E0319 19:15:05.216522 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="thanos-sidecar" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216537 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd2aa46-7091-4165-9da4-248c04907907" 
containerName="thanos-sidecar" Mar 19 19:15:05 crc kubenswrapper[5033]: E0319 19:15:05.216558 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="config-reloader" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216565 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="config-reloader" Mar 19 19:15:05 crc kubenswrapper[5033]: E0319 19:15:05.216574 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216582 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" Mar 19 19:15:05 crc kubenswrapper[5033]: E0319 19:15:05.216592 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="init-config-reloader" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216598 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="init-config-reloader" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216757 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="thanos-sidecar" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216776 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="config-reloader" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.216784 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd2aa46-7091-4165-9da4-248c04907907" containerName="prometheus" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.218414 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.226935 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.228013 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.228091 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.228170 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.228251 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-59b22" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.232377 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.232416 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.232677 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.236329 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.247803 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334033 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334197 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc96\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-kube-api-access-lfc96\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334429 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334476 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334549 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334572 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.334749 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436333 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436462 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436549 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436569 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436603 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436622 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436640 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfc96\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-kube-api-access-lfc96\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436689 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.436745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.439267 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.439381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.439835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.442092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.442490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.442948 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.443912 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.444814 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.445141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.447159 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.447667 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.448822 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.448864 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/166953177fb20786f8e1d18631ecc7a8cdf1ccf34ca7e3b1bfc1a12ac011aaeb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.459915 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfc96\" (UniqueName: \"kubernetes.io/projected/1d0456b5-83d8-48b0-84a0-2d4d60604b9b-kube-api-access-lfc96\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.505638 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2532600-f42e-4d76-80ca-02b9e884ca72\") pod \"prometheus-metric-storage-0\" (UID: \"1d0456b5-83d8-48b0-84a0-2d4d60604b9b\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.531178 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8z6ts" podUID="ebcc8953-fc35-48d7-a3fd-be1a2291c08c" containerName="ovn-controller" probeResult="failure" output=< Mar 19 19:15:05 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 19:15:05 crc kubenswrapper[5033]: > Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.645851 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.768420 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-292e-account-create-update-vzxnc"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.816530 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kpzv2"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.851876 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6fqmp"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.894428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mj7zf" event={"ID":"8112e00f-afca-47a3-b233-b8282cccf396","Type":"ContainerStarted","Data":"6cc5c3b8126b2c8c1fa31e4b66fc877eb0ed4271f9e8bff59b84c82196b2b2c3"} Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.896560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kpzv2" event={"ID":"b4633785-c4bb-4b69-9383-e479734c029f","Type":"ContainerStarted","Data":"8ff8f041569408690ef5a6df3cdf239eef1f6c81db71f112c984990f60c326bb"} Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.897523 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.897549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" event={"ID":"1c873387-43a8-4d46-9499-8625c10a5e6d","Type":"ContainerStarted","Data":"6cf782beaa21998afb39c612770fd6fc3cb0fd270be1596c05ff81af640f09ae"} Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.901035 5033 generic.go:334] "Generic (PLEG): container finished" podID="1d403cb3-1990-4495-8979-0b3f0593ccc1" containerID="3a08f21d3ea9528e3e93595063a06df4de4143e89a4f3a94ce3997861bbfe82b" exitCode=0 Mar 19 19:15:05 crc 
kubenswrapper[5033]: I0319 19:15:05.901160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6fd1-account-create-update-pc8fs" event={"ID":"1d403cb3-1990-4495-8979-0b3f0593ccc1","Type":"ContainerDied","Data":"3a08f21d3ea9528e3e93595063a06df4de4143e89a4f3a94ce3997861bbfe82b"} Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.901201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6fd1-account-create-update-pc8fs" event={"ID":"1d403cb3-1990-4495-8979-0b3f0593ccc1","Type":"ContainerStarted","Data":"49814aa18d1715d4614e45a527146e1ecf82f25ee06418217be64313d00db5d7"} Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.910254 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-v74jb"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.918101 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qc584"] Mar 19 19:15:05 crc kubenswrapper[5033]: W0319 19:15:05.925276 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd2ae324_efc6_404f_8e86_188b2e99710c.slice/crio-42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86 WatchSource:0}: Error finding container 42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86: Status 404 returned error can't find the container with id 42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86 Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.946786 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j9qgq"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.957606 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts-config-8nqqm"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.968495 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-6e4d-account-create-update-bccmp"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.977797 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xnstv"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.990776 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5391-account-create-update-wd7jh"] Mar 19 19:15:05 crc kubenswrapper[5033]: I0319 19:15:05.993829 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mj7zf" podStartSLOduration=3.750937781 podStartE2EDuration="16.993808058s" podCreationTimestamp="2026-03-19 19:14:49 +0000 UTC" firstStartedPulling="2026-03-19 19:14:51.239708946 +0000 UTC m=+1101.344738795" lastFinishedPulling="2026-03-19 19:15:04.482579233 +0000 UTC m=+1114.587609072" observedRunningTime="2026-03-19 19:15:05.910738353 +0000 UTC m=+1116.015768202" watchObservedRunningTime="2026-03-19 19:15:05.993808058 +0000 UTC m=+1116.098837907" Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.195414 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:15:06 crc kubenswrapper[5033]: W0319 19:15:06.225066 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0456b5_83d8_48b0_84a0_2d4d60604b9b.slice/crio-81323f4accc6126b6af6236809e9dac1fc825ef8726d65f31290502ebc5266a7 WatchSource:0}: Error finding container 81323f4accc6126b6af6236809e9dac1fc825ef8726d65f31290502ebc5266a7: Status 404 returned error can't find the container with id 81323f4accc6126b6af6236809e9dac1fc825ef8726d65f31290502ebc5266a7 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.255536 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:15:06 crc kubenswrapper[5033]: W0319 19:15:06.280554 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda91fda80_4324_4015_a32f_3396d6d2da1d.slice/crio-c6c447614963fdad8933101476176f362037b4e983a2ec466fd781231cf1e62e WatchSource:0}: Error finding container c6c447614963fdad8933101476176f362037b4e983a2ec466fd781231cf1e62e: Status 404 returned error can't find the container with id c6c447614963fdad8933101476176f362037b4e983a2ec466fd781231cf1e62e Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.635019 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd2aa46-7091-4165-9da4-248c04907907" path="/var/lib/kubelet/pods/ffd2aa46-7091-4165-9da4-248c04907907/volumes" Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.929359 5033 generic.go:334] "Generic (PLEG): container finished" podID="c8e994ae-73e2-439b-bb97-c00e133d03cb" containerID="97aa45a9df5b72d0bf0a298a212e82dd7ad883939a6cdb9e0379358ce8048c49" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.929504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j9qgq" event={"ID":"c8e994ae-73e2-439b-bb97-c00e133d03cb","Type":"ContainerDied","Data":"97aa45a9df5b72d0bf0a298a212e82dd7ad883939a6cdb9e0379358ce8048c49"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.929908 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j9qgq" event={"ID":"c8e994ae-73e2-439b-bb97-c00e133d03cb","Type":"ContainerStarted","Data":"6d8697a87767dfd4a97d4dbd0a2038ff25444da07e1370a55ed7fa23299a66bc"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.932068 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerStarted","Data":"81323f4accc6126b6af6236809e9dac1fc825ef8726d65f31290502ebc5266a7"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.933385 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="a839076c-968b-4a81-9e49-f6c691b95dd6" containerID="013ecc38ffbe5625f4b21a384258763e8f955c6fe52b0c4583f257050a99ea8d" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.933493 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v74jb" event={"ID":"a839076c-968b-4a81-9e49-f6c691b95dd6","Type":"ContainerDied","Data":"013ecc38ffbe5625f4b21a384258763e8f955c6fe52b0c4583f257050a99ea8d"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.933547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v74jb" event={"ID":"a839076c-968b-4a81-9e49-f6c691b95dd6","Type":"ContainerStarted","Data":"ab9a7fc7a425cee5df2a4491bbd95656962456153437d1687a71f867dec7fe22"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.935471 5033 generic.go:334] "Generic (PLEG): container finished" podID="962b3cc1-defe-429e-a24c-68c3bec382fe" containerID="f5c65fc239aa141290d587868c68e28894830eccf2b2950cd5d9df1831661065" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.935524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" event={"ID":"962b3cc1-defe-429e-a24c-68c3bec382fe","Type":"ContainerDied","Data":"f5c65fc239aa141290d587868c68e28894830eccf2b2950cd5d9df1831661065"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.935541 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" event={"ID":"962b3cc1-defe-429e-a24c-68c3bec382fe","Type":"ContainerStarted","Data":"9e7b6f22c51d19814ae7e3a51df701f28e4b165eaaa64b21a23501deebbb786c"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.937290 5033 generic.go:334] "Generic (PLEG): container finished" podID="8a241c55-520e-441e-97fa-137e8161daa3" containerID="ec3c35d6f5f4748a119ed1fec4639a9c30c8d7239c0a61dbdcd0302a98af604c" exitCode=0 Mar 19 19:15:06 crc 
kubenswrapper[5033]: I0319 19:15:06.937328 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5391-account-create-update-wd7jh" event={"ID":"8a241c55-520e-441e-97fa-137e8161daa3","Type":"ContainerDied","Data":"ec3c35d6f5f4748a119ed1fec4639a9c30c8d7239c0a61dbdcd0302a98af604c"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.937362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5391-account-create-update-wd7jh" event={"ID":"8a241c55-520e-441e-97fa-137e8161daa3","Type":"ContainerStarted","Data":"88f5e88711f69cc0c0a1dedb02e776a1e920cda67713e36b3a23456d9b09d2d1"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.938777 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"c6c447614963fdad8933101476176f362037b4e983a2ec466fd781231cf1e62e"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.945489 5033 generic.go:334] "Generic (PLEG): container finished" podID="2fa4e771-cd1c-490f-bda5-4b30be140739" containerID="92374f44c7cd1850a79ddede9b9a4f43d3739e54fb9ed3df3f64befbc38bed71" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.945563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xnstv" event={"ID":"2fa4e771-cd1c-490f-bda5-4b30be140739","Type":"ContainerDied","Data":"92374f44c7cd1850a79ddede9b9a4f43d3739e54fb9ed3df3f64befbc38bed71"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.945598 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xnstv" event={"ID":"2fa4e771-cd1c-490f-bda5-4b30be140739","Type":"ContainerStarted","Data":"121765c7e21ca8d6241559c06e939efe774725736a91083b915fd91f915410ae"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.951335 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-8nqqm" 
event={"ID":"cd2ae324-efc6-404f-8e86-188b2e99710c","Type":"ContainerStarted","Data":"7c07c80951690c8f39ec1c80441ae37f6fbc5de0ce477c5c9090a5455c4dc465"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.951404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-8nqqm" event={"ID":"cd2ae324-efc6-404f-8e86-188b2e99710c","Type":"ContainerStarted","Data":"42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.958099 5033 generic.go:334] "Generic (PLEG): container finished" podID="1c873387-43a8-4d46-9499-8625c10a5e6d" containerID="f86cff58c74f47f0e364cdf06382332179b7eb8928d038c55623ea9f670b3098" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.958175 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" event={"ID":"1c873387-43a8-4d46-9499-8625c10a5e6d","Type":"ContainerDied","Data":"f86cff58c74f47f0e364cdf06382332179b7eb8928d038c55623ea9f670b3098"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.967193 5033 generic.go:334] "Generic (PLEG): container finished" podID="64138b55-5be9-44e1-9663-5789ff9b51fd" containerID="f1196529a345f04ce08891b6e1fab7d1896f67f80406e51b21c1e97290265e7e" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.967324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e4d-account-create-update-bccmp" event={"ID":"64138b55-5be9-44e1-9663-5789ff9b51fd","Type":"ContainerDied","Data":"f1196529a345f04ce08891b6e1fab7d1896f67f80406e51b21c1e97290265e7e"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.967361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e4d-account-create-update-bccmp" event={"ID":"64138b55-5be9-44e1-9663-5789ff9b51fd","Type":"ContainerStarted","Data":"48d0ee59f3ac4c828e7a05ebd1b0b43cc2497d741c48a2c3cfd5577ff2798b82"} Mar 19 19:15:06 crc 
kubenswrapper[5033]: I0319 19:15:06.977847 5033 generic.go:334] "Generic (PLEG): container finished" podID="b518db5b-e766-4d03-94b4-b3d72b3edbae" containerID="4a37194d6d76a51ef090bb4212f544a6651602c96b5f61fba76eda4bb51794b4" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.977935 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fqmp" event={"ID":"b518db5b-e766-4d03-94b4-b3d72b3edbae","Type":"ContainerDied","Data":"4a37194d6d76a51ef090bb4212f544a6651602c96b5f61fba76eda4bb51794b4"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.977973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fqmp" event={"ID":"b518db5b-e766-4d03-94b4-b3d72b3edbae","Type":"ContainerStarted","Data":"fb1f2e9b40542dbf7191ffe04280c583beaeafb83c106f1a2099535429842b14"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.985414 5033 generic.go:334] "Generic (PLEG): container finished" podID="0b698449-e87b-49b6-81d8-f101ce8304c9" containerID="44c231df158f80b1b4587398f06347870bb684500186d6aa5fcff8717a989758" exitCode=0 Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.985718 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qc584" event={"ID":"0b698449-e87b-49b6-81d8-f101ce8304c9","Type":"ContainerDied","Data":"44c231df158f80b1b4587398f06347870bb684500186d6aa5fcff8717a989758"} Mar 19 19:15:06 crc kubenswrapper[5033]: I0319 19:15:06.985743 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qc584" event={"ID":"0b698449-e87b-49b6-81d8-f101ce8304c9","Type":"ContainerStarted","Data":"987e88324ca95aac118f9e3c8a8d04f3b2e258e2853354e2951df6cf7a6c3b24"} Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.073859 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8z6ts-config-8nqqm" podStartSLOduration=12.073837424 podStartE2EDuration="12.073837424s" 
podCreationTimestamp="2026-03-19 19:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:07.065766557 +0000 UTC m=+1117.170796426" watchObservedRunningTime="2026-03-19 19:15:07.073837424 +0000 UTC m=+1117.178867273" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.339202 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.481696 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ctw2\" (UniqueName: \"kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2\") pod \"1d403cb3-1990-4495-8979-0b3f0593ccc1\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.481982 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts\") pod \"1d403cb3-1990-4495-8979-0b3f0593ccc1\" (UID: \"1d403cb3-1990-4495-8979-0b3f0593ccc1\") " Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.482771 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d403cb3-1990-4495-8979-0b3f0593ccc1" (UID: "1d403cb3-1990-4495-8979-0b3f0593ccc1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.488208 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2" (OuterVolumeSpecName: "kube-api-access-2ctw2") pod "1d403cb3-1990-4495-8979-0b3f0593ccc1" (UID: "1d403cb3-1990-4495-8979-0b3f0593ccc1"). InnerVolumeSpecName "kube-api-access-2ctw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.583924 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ctw2\" (UniqueName: \"kubernetes.io/projected/1d403cb3-1990-4495-8979-0b3f0593ccc1-kube-api-access-2ctw2\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.583951 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d403cb3-1990-4495-8979-0b3f0593ccc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.998855 5033 generic.go:334] "Generic (PLEG): container finished" podID="cd2ae324-efc6-404f-8e86-188b2e99710c" containerID="7c07c80951690c8f39ec1c80441ae37f6fbc5de0ce477c5c9090a5455c4dc465" exitCode=0 Mar 19 19:15:07 crc kubenswrapper[5033]: I0319 19:15:07.998940 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-8nqqm" event={"ID":"cd2ae324-efc6-404f-8e86-188b2e99710c","Type":"ContainerDied","Data":"7c07c80951690c8f39ec1c80441ae37f6fbc5de0ce477c5c9090a5455c4dc465"} Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.002413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"a317d430bd171463779c004426a701e902aa391fc8ce88fbff1106345776f2e3"} Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 
19:15:08.004151 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6fd1-account-create-update-pc8fs" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.004617 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6fd1-account-create-update-pc8fs" event={"ID":"1d403cb3-1990-4495-8979-0b3f0593ccc1","Type":"ContainerDied","Data":"49814aa18d1715d4614e45a527146e1ecf82f25ee06418217be64313d00db5d7"} Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.004670 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49814aa18d1715d4614e45a527146e1ecf82f25ee06418217be64313d00db5d7" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.734695 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-v74jb" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.739922 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qc584" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.805752 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts\") pod \"a839076c-968b-4a81-9e49-f6c691b95dd6\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.805860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l76db\" (UniqueName: \"kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db\") pod \"a839076c-968b-4a81-9e49-f6c691b95dd6\" (UID: \"a839076c-968b-4a81-9e49-f6c691b95dd6\") " Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.806585 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a839076c-968b-4a81-9e49-f6c691b95dd6" (UID: "a839076c-968b-4a81-9e49-f6c691b95dd6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.907956 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts\") pod \"0b698449-e87b-49b6-81d8-f101ce8304c9\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.908011 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6q5m\" (UniqueName: \"kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m\") pod \"0b698449-e87b-49b6-81d8-f101ce8304c9\" (UID: \"0b698449-e87b-49b6-81d8-f101ce8304c9\") " Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.908439 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a839076c-968b-4a81-9e49-f6c691b95dd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.908951 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b698449-e87b-49b6-81d8-f101ce8304c9" (UID: "0b698449-e87b-49b6-81d8-f101ce8304c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.979716 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db" (OuterVolumeSpecName: "kube-api-access-l76db") pod "a839076c-968b-4a81-9e49-f6c691b95dd6" (UID: "a839076c-968b-4a81-9e49-f6c691b95dd6"). InnerVolumeSpecName "kube-api-access-l76db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:08 crc kubenswrapper[5033]: I0319 19:15:08.979786 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m" (OuterVolumeSpecName: "kube-api-access-c6q5m") pod "0b698449-e87b-49b6-81d8-f101ce8304c9" (UID: "0b698449-e87b-49b6-81d8-f101ce8304c9"). InnerVolumeSpecName "kube-api-access-c6q5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.009648 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b698449-e87b-49b6-81d8-f101ce8304c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.009677 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6q5m\" (UniqueName: \"kubernetes.io/projected/0b698449-e87b-49b6-81d8-f101ce8304c9-kube-api-access-c6q5m\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.009689 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l76db\" (UniqueName: \"kubernetes.io/projected/a839076c-968b-4a81-9e49-f6c691b95dd6-kube-api-access-l76db\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.033745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"63e78dc09a8a738daa111b39336d34034e83e6ad7594e373c189287a0644c713"} Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.035721 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qc584" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.035675 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qc584" event={"ID":"0b698449-e87b-49b6-81d8-f101ce8304c9","Type":"ContainerDied","Data":"987e88324ca95aac118f9e3c8a8d04f3b2e258e2853354e2951df6cf7a6c3b24"} Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.035768 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987e88324ca95aac118f9e3c8a8d04f3b2e258e2853354e2951df6cf7a6c3b24" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.037966 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-v74jb" event={"ID":"a839076c-968b-4a81-9e49-f6c691b95dd6","Type":"ContainerDied","Data":"ab9a7fc7a425cee5df2a4491bbd95656962456153437d1687a71f867dec7fe22"} Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.037995 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-v74jb" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.038002 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9a7fc7a425cee5df2a4491bbd95656962456153437d1687a71f867dec7fe22" Mar 19 19:15:09 crc kubenswrapper[5033]: I0319 19:15:09.893275 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="31733aba-46c2-4129-9088-e294daafa285" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.048671 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerStarted","Data":"9f4440f2b382e9bcb56bfa0e5fa9f5a6b32444149da70854e166254987d471fb"} Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.512947 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8z6ts" Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.759228 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.759606 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.759653 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.760487 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:15:10 crc kubenswrapper[5033]: I0319 19:15:10.760558 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750" gracePeriod=600 Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.059915 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750" exitCode=0 Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.059974 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750"} Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.060030 5033 scope.go:117] "RemoveContainer" containerID="060e13ea929f52922bce7770de4d206705d5b363635c1a012d7934ab606c36d8" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.862566 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6fqmp" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.902634 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.944421 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.950859 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j9qgq" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.962324 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.980786 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:11 crc kubenswrapper[5033]: I0319 19:15:11.997629 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.009178 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts\") pod \"2fa4e771-cd1c-490f-bda5-4b30be140739\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.009250 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qptzc\" (UniqueName: \"kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc\") pod \"2fa4e771-cd1c-490f-bda5-4b30be140739\" (UID: \"2fa4e771-cd1c-490f-bda5-4b30be140739\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.009412 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqqg9\" (UniqueName: 
\"kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9\") pod \"b518db5b-e766-4d03-94b4-b3d72b3edbae\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.009548 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts\") pod \"b518db5b-e766-4d03-94b4-b3d72b3edbae\" (UID: \"b518db5b-e766-4d03-94b4-b3d72b3edbae\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.011686 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b518db5b-e766-4d03-94b4-b3d72b3edbae" (UID: "b518db5b-e766-4d03-94b4-b3d72b3edbae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.012949 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2fa4e771-cd1c-490f-bda5-4b30be140739" (UID: "2fa4e771-cd1c-490f-bda5-4b30be140739"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.018157 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.024203 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9" (OuterVolumeSpecName: "kube-api-access-vqqg9") pod "b518db5b-e766-4d03-94b4-b3d72b3edbae" (UID: "b518db5b-e766-4d03-94b4-b3d72b3edbae"). 
InnerVolumeSpecName "kube-api-access-vqqg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.031510 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc" (OuterVolumeSpecName: "kube-api-access-qptzc") pod "2fa4e771-cd1c-490f-bda5-4b30be140739" (UID: "2fa4e771-cd1c-490f-bda5-4b30be140739"). InnerVolumeSpecName "kube-api-access-qptzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.093012 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e4d-account-create-update-bccmp" event={"ID":"64138b55-5be9-44e1-9663-5789ff9b51fd","Type":"ContainerDied","Data":"48d0ee59f3ac4c828e7a05ebd1b0b43cc2497d741c48a2c3cfd5577ff2798b82"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.094228 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d0ee59f3ac4c828e7a05ebd1b0b43cc2497d741c48a2c3cfd5577ff2798b82" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.093253 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e4d-account-create-update-bccmp" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.097932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fqmp" event={"ID":"b518db5b-e766-4d03-94b4-b3d72b3edbae","Type":"ContainerDied","Data":"fb1f2e9b40542dbf7191ffe04280c583beaeafb83c106f1a2099535429842b14"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.098217 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1f2e9b40542dbf7191ffe04280c583beaeafb83c106f1a2099535429842b14" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.097953 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6fqmp" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.101269 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xnstv" event={"ID":"2fa4e771-cd1c-490f-bda5-4b30be140739","Type":"ContainerDied","Data":"121765c7e21ca8d6241559c06e939efe774725736a91083b915fd91f915410ae"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.101327 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="121765c7e21ca8d6241559c06e939efe774725736a91083b915fd91f915410ae" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.101429 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xnstv" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111226 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvjp\" (UniqueName: \"kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp\") pod \"1c873387-43a8-4d46-9499-8625c10a5e6d\" (UID: \"1c873387-43a8-4d46-9499-8625c10a5e6d\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111287 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111392 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts\") pod \"c8e994ae-73e2-439b-bb97-c00e133d03cb\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111413 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksdtw\" (UniqueName: \"kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw\") pod \"8a241c55-520e-441e-97fa-137e8161daa3\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111506 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gxz5\" (UniqueName: \"kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5\") pod \"c8e994ae-73e2-439b-bb97-c00e133d03cb\" (UID: \"c8e994ae-73e2-439b-bb97-c00e133d03cb\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts\") pod \"64138b55-5be9-44e1-9663-5789ff9b51fd\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111625 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts\") pod \"8a241c55-520e-441e-97fa-137e8161daa3\" (UID: \"8a241c55-520e-441e-97fa-137e8161daa3\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111678 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts\") pod \"1c873387-43a8-4d46-9499-8625c10a5e6d\" (UID: 
\"1c873387-43a8-4d46-9499-8625c10a5e6d\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111712 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111824 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkch\" (UniqueName: \"kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch\") pod \"64138b55-5be9-44e1-9663-5789ff9b51fd\" (UID: \"64138b55-5be9-44e1-9663-5789ff9b51fd\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111958 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7nml\" (UniqueName: \"kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml\") pod \"962b3cc1-defe-429e-a24c-68c3bec382fe\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112014 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume\") pod \"962b3cc1-defe-429e-a24c-68c3bec382fe\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112044 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbg49\" 
(UniqueName: \"kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112082 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts\") pod \"cd2ae324-efc6-404f-8e86-188b2e99710c\" (UID: \"cd2ae324-efc6-404f-8e86-188b2e99710c\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112110 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume\") pod \"962b3cc1-defe-429e-a24c-68c3bec382fe\" (UID: \"962b3cc1-defe-429e-a24c-68c3bec382fe\") " Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112254 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run" (OuterVolumeSpecName: "var-run") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112662 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fa4e771-cd1c-490f-bda5-4b30be140739-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112682 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qptzc\" (UniqueName: \"kubernetes.io/projected/2fa4e771-cd1c-490f-bda5-4b30be140739-kube-api-access-qptzc\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112696 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqqg9\" (UniqueName: \"kubernetes.io/projected/b518db5b-e766-4d03-94b4-b3d72b3edbae-kube-api-access-vqqg9\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112740 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.112751 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b518db5b-e766-4d03-94b4-b3d72b3edbae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.113003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64138b55-5be9-44e1-9663-5789ff9b51fd" (UID: "64138b55-5be9-44e1-9663-5789ff9b51fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.113016 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a241c55-520e-441e-97fa-137e8161daa3" (UID: "8a241c55-520e-441e-97fa-137e8161daa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.113400 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c873387-43a8-4d46-9499-8625c10a5e6d" (UID: "1c873387-43a8-4d46-9499-8625c10a5e6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.113463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111717 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-8nqqm" event={"ID":"cd2ae324-efc6-404f-8e86-188b2e99710c","Type":"ContainerDied","Data":"42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.113496 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42f6ee14d7965e963cc657357e3a89e14357eb5f3adb613b525055f91c47ce86" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.114535 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.114902 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "962b3cc1-defe-429e-a24c-68c3bec382fe" (UID: "962b3cc1-defe-429e-a24c-68c3bec382fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.111827 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-8nqqm" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.115221 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8e994ae-73e2-439b-bb97-c00e133d03cb" (UID: "c8e994ae-73e2-439b-bb97-c00e133d03cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.116087 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.116396 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts" (OuterVolumeSpecName: "scripts") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.116577 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5" (OuterVolumeSpecName: "kube-api-access-4gxz5") pod "c8e994ae-73e2-439b-bb97-c00e133d03cb" (UID: "c8e994ae-73e2-439b-bb97-c00e133d03cb"). InnerVolumeSpecName "kube-api-access-4gxz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.116948 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp" (OuterVolumeSpecName: "kube-api-access-ddvjp") pod "1c873387-43a8-4d46-9499-8625c10a5e6d" (UID: "1c873387-43a8-4d46-9499-8625c10a5e6d"). InnerVolumeSpecName "kube-api-access-ddvjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.119508 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw" (OuterVolumeSpecName: "kube-api-access-ksdtw") pod "8a241c55-520e-441e-97fa-137e8161daa3" (UID: "8a241c55-520e-441e-97fa-137e8161daa3"). InnerVolumeSpecName "kube-api-access-ksdtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.119555 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"cb8a78e3772704c15d182b3715616151be968dbd1903371706732790f15d614d"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.119601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch" (OuterVolumeSpecName: "kube-api-access-djkch") pod "64138b55-5be9-44e1-9663-5789ff9b51fd" (UID: "64138b55-5be9-44e1-9663-5789ff9b51fd"). InnerVolumeSpecName "kube-api-access-djkch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.119654 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "962b3cc1-defe-429e-a24c-68c3bec382fe" (UID: "962b3cc1-defe-429e-a24c-68c3bec382fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.122521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" event={"ID":"1c873387-43a8-4d46-9499-8625c10a5e6d","Type":"ContainerDied","Data":"6cf782beaa21998afb39c612770fd6fc3cb0fd270be1596c05ff81af640f09ae"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.122571 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf782beaa21998afb39c612770fd6fc3cb0fd270be1596c05ff81af640f09ae" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.122605 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml" (OuterVolumeSpecName: "kube-api-access-k7nml") pod "962b3cc1-defe-429e-a24c-68c3bec382fe" (UID: "962b3cc1-defe-429e-a24c-68c3bec382fe"). InnerVolumeSpecName "kube-api-access-k7nml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.122654 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-292e-account-create-update-vzxnc" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.125813 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.126188 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49" (OuterVolumeSpecName: "kube-api-access-jbg49") pod "cd2ae324-efc6-404f-8e86-188b2e99710c" (UID: "cd2ae324-efc6-404f-8e86-188b2e99710c"). InnerVolumeSpecName "kube-api-access-jbg49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.129419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kpzv2" event={"ID":"b4633785-c4bb-4b69-9383-e479734c029f","Type":"ContainerStarted","Data":"3265d8b4452c653630bb972dab0f49ae5a5c4e688f18570c6c3cbbbeecb9f7f3"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.133589 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" event={"ID":"962b3cc1-defe-429e-a24c-68c3bec382fe","Type":"ContainerDied","Data":"9e7b6f22c51d19814ae7e3a51df701f28e4b165eaaa64b21a23501deebbb786c"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.133820 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7b6f22c51d19814ae7e3a51df701f28e4b165eaaa64b21a23501deebbb786c" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.134103 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.136746 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5391-account-create-update-wd7jh" event={"ID":"8a241c55-520e-441e-97fa-137e8161daa3","Type":"ContainerDied","Data":"88f5e88711f69cc0c0a1dedb02e776a1e920cda67713e36b3a23456d9b09d2d1"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.136784 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f5e88711f69cc0c0a1dedb02e776a1e920cda67713e36b3a23456d9b09d2d1" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.136864 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5391-account-create-update-wd7jh" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.151429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j9qgq" event={"ID":"c8e994ae-73e2-439b-bb97-c00e133d03cb","Type":"ContainerDied","Data":"6d8697a87767dfd4a97d4dbd0a2038ff25444da07e1370a55ed7fa23299a66bc"} Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.151487 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8697a87767dfd4a97d4dbd0a2038ff25444da07e1370a55ed7fa23299a66bc" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.151561 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j9qgq" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.166283 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kpzv2" podStartSLOduration=8.270468353 podStartE2EDuration="14.166253062s" podCreationTimestamp="2026-03-19 19:14:58 +0000 UTC" firstStartedPulling="2026-03-19 19:15:05.867803836 +0000 UTC m=+1115.972833685" lastFinishedPulling="2026-03-19 19:15:11.763588545 +0000 UTC m=+1121.868618394" observedRunningTime="2026-03-19 19:15:12.162180338 +0000 UTC m=+1122.267210197" watchObservedRunningTime="2026-03-19 19:15:12.166253062 +0000 UTC m=+1122.271282911" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.214973 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gxz5\" (UniqueName: \"kubernetes.io/projected/c8e994ae-73e2-439b-bb97-c00e133d03cb-kube-api-access-4gxz5\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215006 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64138b55-5be9-44e1-9663-5789ff9b51fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215015 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a241c55-520e-441e-97fa-137e8161daa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215024 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c873387-43a8-4d46-9499-8625c10a5e6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215033 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215044 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkch\" (UniqueName: \"kubernetes.io/projected/64138b55-5be9-44e1-9663-5789ff9b51fd-kube-api-access-djkch\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215052 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7nml\" (UniqueName: \"kubernetes.io/projected/962b3cc1-defe-429e-a24c-68c3bec382fe-kube-api-access-k7nml\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215060 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/962b3cc1-defe-429e-a24c-68c3bec382fe-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215068 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbg49\" (UniqueName: \"kubernetes.io/projected/cd2ae324-efc6-404f-8e86-188b2e99710c-kube-api-access-jbg49\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215076 5033 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215083 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/962b3cc1-defe-429e-a24c-68c3bec382fe-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215091 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvjp\" (UniqueName: 
\"kubernetes.io/projected/1c873387-43a8-4d46-9499-8625c10a5e6d-kube-api-access-ddvjp\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215100 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd2ae324-efc6-404f-8e86-188b2e99710c-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215108 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cd2ae324-efc6-404f-8e86-188b2e99710c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215115 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8e994ae-73e2-439b-bb97-c00e133d03cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:12 crc kubenswrapper[5033]: I0319 19:15:12.215125 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksdtw\" (UniqueName: \"kubernetes.io/projected/8a241c55-520e-441e-97fa-137e8161daa3-kube-api-access-ksdtw\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.178079 5033 generic.go:334] "Generic (PLEG): container finished" podID="8112e00f-afca-47a3-b233-b8282cccf396" containerID="6cc5c3b8126b2c8c1fa31e4b66fc877eb0ed4271f9e8bff59b84c82196b2b2c3" exitCode=0 Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.178391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mj7zf" event={"ID":"8112e00f-afca-47a3-b233-b8282cccf396","Type":"ContainerDied","Data":"6cc5c3b8126b2c8c1fa31e4b66fc877eb0ed4271f9e8bff59b84c82196b2b2c3"} Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.192289 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"a306feb41cfb638d7dfa06b9371f9c639727a821bb8f688a26a180b87e27adfb"} Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.200003 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8z6ts-config-8nqqm"] Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.220511 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8z6ts-config-8nqqm"] Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.399332 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8z6ts-config-gdvsz"] Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.399967 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a241c55-520e-441e-97fa-137e8161daa3" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.399984 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a241c55-520e-441e-97fa-137e8161daa3" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.399998 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa4e771-cd1c-490f-bda5-4b30be140739" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400006 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa4e771-cd1c-490f-bda5-4b30be140739" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400017 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b518db5b-e766-4d03-94b4-b3d72b3edbae" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400023 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b518db5b-e766-4d03-94b4-b3d72b3edbae" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400034 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0b698449-e87b-49b6-81d8-f101ce8304c9" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400040 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b698449-e87b-49b6-81d8-f101ce8304c9" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400053 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2ae324-efc6-404f-8e86-188b2e99710c" containerName="ovn-config" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400058 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd2ae324-efc6-404f-8e86-188b2e99710c" containerName="ovn-config" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400069 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e994ae-73e2-439b-bb97-c00e133d03cb" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400099 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e994ae-73e2-439b-bb97-c00e133d03cb" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400110 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d403cb3-1990-4495-8979-0b3f0593ccc1" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400116 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d403cb3-1990-4495-8979-0b3f0593ccc1" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400130 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a839076c-968b-4a81-9e49-f6c691b95dd6" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400138 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a839076c-968b-4a81-9e49-f6c691b95dd6" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc 
kubenswrapper[5033]: E0319 19:15:13.400150 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962b3cc1-defe-429e-a24c-68c3bec382fe" containerName="collect-profiles" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400155 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="962b3cc1-defe-429e-a24c-68c3bec382fe" containerName="collect-profiles" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400168 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c873387-43a8-4d46-9499-8625c10a5e6d" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400174 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c873387-43a8-4d46-9499-8625c10a5e6d" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: E0319 19:15:13.400181 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64138b55-5be9-44e1-9663-5789ff9b51fd" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400188 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="64138b55-5be9-44e1-9663-5789ff9b51fd" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400345 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd2ae324-efc6-404f-8e86-188b2e99710c" containerName="ovn-config" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400376 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b698449-e87b-49b6-81d8-f101ce8304c9" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400391 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa4e771-cd1c-490f-bda5-4b30be140739" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400408 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1c873387-43a8-4d46-9499-8625c10a5e6d" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400424 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d403cb3-1990-4495-8979-0b3f0593ccc1" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400433 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a839076c-968b-4a81-9e49-f6c691b95dd6" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400492 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e994ae-73e2-439b-bb97-c00e133d03cb" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400502 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64138b55-5be9-44e1-9663-5789ff9b51fd" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400516 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="962b3cc1-defe-429e-a24c-68c3bec382fe" containerName="collect-profiles" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400524 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b518db5b-e766-4d03-94b4-b3d72b3edbae" containerName="mariadb-database-create" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.400540 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a241c55-520e-441e-97fa-137e8161daa3" containerName="mariadb-account-create-update" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.401246 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.413269 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts-config-gdvsz"] Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.417987 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547023 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547212 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66f99\" (UniqueName: \"kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: 
\"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547243 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.547259 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649218 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66f99\" (UniqueName: \"kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649309 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run\") pod 
\"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649369 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649389 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649433 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649756 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649806 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: 
\"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.649851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.650233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.651594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.681748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66f99\" (UniqueName: \"kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99\") pod \"ovn-controller-8z6ts-config-gdvsz\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:13 crc kubenswrapper[5033]: I0319 19:15:13.717923 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:14 crc kubenswrapper[5033]: I0319 19:15:14.209872 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"ebf159c5e5e246f921c1d2dc69bfad75fecb3debc0d82009bb3a27caa2bb3071"} Mar 19 19:15:14 crc kubenswrapper[5033]: I0319 19:15:14.216115 5033 generic.go:334] "Generic (PLEG): container finished" podID="1d0456b5-83d8-48b0-84a0-2d4d60604b9b" containerID="9f4440f2b382e9bcb56bfa0e5fa9f5a6b32444149da70854e166254987d471fb" exitCode=0 Mar 19 19:15:14 crc kubenswrapper[5033]: I0319 19:15:14.216403 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerDied","Data":"9f4440f2b382e9bcb56bfa0e5fa9f5a6b32444149da70854e166254987d471fb"} Mar 19 19:15:14 crc kubenswrapper[5033]: I0319 19:15:14.286873 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8z6ts-config-gdvsz"] Mar 19 19:15:14 crc kubenswrapper[5033]: W0319 19:15:14.333654 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da6d037_70d9_48e5_9dd6_8c205f784235.slice/crio-37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a WatchSource:0}: Error finding container 37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a: Status 404 returned error can't find the container with id 37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a Mar 19 19:15:14 crc kubenswrapper[5033]: I0319 19:15:14.633338 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd2ae324-efc6-404f-8e86-188b2e99710c" path="/var/lib/kubelet/pods/cd2ae324-efc6-404f-8e86-188b2e99710c/volumes" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.161506 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mj7zf" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.227725 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"089b1f8b5ae3be77ebd8665dd6e49712b7e8ab8ce3b835875796c66808a5aa6f"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.227772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"e64e54991cab625f25c2959e161b325223ccf3bc6282fa4f2956ade96b581b6d"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.227787 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"04151ac1cb9f6ca3238cf6cb8f1fd9365f9a9c5bf9aaaef415815d48fa19a50b"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.230790 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerStarted","Data":"fb414a3f24148cd86dad032d0ee592a983616ef8e06e2897e39310a5b9158dfa"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.232425 5033 generic.go:334] "Generic (PLEG): container finished" podID="6da6d037-70d9-48e5-9dd6-8c205f784235" containerID="3ec877d37bd42b2eabbe46a9aa3fe785cc6c03ff251eef151e06328d630d0664" exitCode=0 Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.232495 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-gdvsz" event={"ID":"6da6d037-70d9-48e5-9dd6-8c205f784235","Type":"ContainerDied","Data":"3ec877d37bd42b2eabbe46a9aa3fe785cc6c03ff251eef151e06328d630d0664"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.232516 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-8z6ts-config-gdvsz" event={"ID":"6da6d037-70d9-48e5-9dd6-8c205f784235","Type":"ContainerStarted","Data":"37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.234314 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mj7zf" event={"ID":"8112e00f-afca-47a3-b233-b8282cccf396","Type":"ContainerDied","Data":"a63c135a757863b2f14a9f068393579deee95d9aaa899f3730840e42214875eb"} Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.234335 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63c135a757863b2f14a9f068393579deee95d9aaa899f3730840e42214875eb" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.234371 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mj7zf" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.282618 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9\") pod \"8112e00f-afca-47a3-b233-b8282cccf396\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.282764 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data\") pod \"8112e00f-afca-47a3-b233-b8282cccf396\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.283681 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data\") pod \"8112e00f-afca-47a3-b233-b8282cccf396\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " Mar 19 
19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.283712 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle\") pod \"8112e00f-afca-47a3-b233-b8282cccf396\" (UID: \"8112e00f-afca-47a3-b233-b8282cccf396\") " Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.287321 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9" (OuterVolumeSpecName: "kube-api-access-q64h9") pod "8112e00f-afca-47a3-b233-b8282cccf396" (UID: "8112e00f-afca-47a3-b233-b8282cccf396"). InnerVolumeSpecName "kube-api-access-q64h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.287321 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8112e00f-afca-47a3-b233-b8282cccf396" (UID: "8112e00f-afca-47a3-b233-b8282cccf396"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.307726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8112e00f-afca-47a3-b233-b8282cccf396" (UID: "8112e00f-afca-47a3-b233-b8282cccf396"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.328210 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data" (OuterVolumeSpecName: "config-data") pod "8112e00f-afca-47a3-b233-b8282cccf396" (UID: "8112e00f-afca-47a3-b233-b8282cccf396"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.385397 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.385432 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.385440 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8112e00f-afca-47a3-b233-b8282cccf396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.385463 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/8112e00f-afca-47a3-b233-b8282cccf396-kube-api-access-q64h9\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.572590 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:15 crc kubenswrapper[5033]: E0319 19:15:15.573044 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8112e00f-afca-47a3-b233-b8282cccf396" containerName="glance-db-sync" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.573060 
5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8112e00f-afca-47a3-b233-b8282cccf396" containerName="glance-db-sync" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.573236 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8112e00f-afca-47a3-b233-b8282cccf396" containerName="glance-db-sync" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.574275 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.597277 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.691579 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.691668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.691709 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.691737 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.691782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7cx\" (UniqueName: \"kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.793466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.793510 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.793545 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.793573 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.793624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7cx\" (UniqueName: \"kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.794561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.794777 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.794876 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.795279 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.818292 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7cx\" (UniqueName: \"kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx\") pod \"dnsmasq-dns-5b946c75cc-sbmvv\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:15 crc kubenswrapper[5033]: I0319 19:15:15.920570 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.288543 5033 generic.go:334] "Generic (PLEG): container finished" podID="b4633785-c4bb-4b69-9383-e479734c029f" containerID="3265d8b4452c653630bb972dab0f49ae5a5c4e688f18570c6c3cbbbeecb9f7f3" exitCode=0 Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.289019 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kpzv2" event={"ID":"b4633785-c4bb-4b69-9383-e479734c029f","Type":"ContainerDied","Data":"3265d8b4452c653630bb972dab0f49ae5a5c4e688f18570c6c3cbbbeecb9f7f3"} Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.651435 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.693565 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807898 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807935 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66f99\" (UniqueName: \"kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.807961 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808005 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts\") pod \"6da6d037-70d9-48e5-9dd6-8c205f784235\" (UID: \"6da6d037-70d9-48e5-9dd6-8c205f784235\") " Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808077 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run" (OuterVolumeSpecName: "var-run") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808494 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808518 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808530 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6da6d037-70d9-48e5-9dd6-8c205f784235-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.808733 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.809001 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts" (OuterVolumeSpecName: "scripts") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: W0319 19:15:16.884130 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e46793f_868f_42f3_85f4_007b0047583b.slice/crio-2ab9f5b8ad3be3764284c4495dff65a804e7551c4b7b9944110110ed9a0ab327 WatchSource:0}: Error finding container 2ab9f5b8ad3be3764284c4495dff65a804e7551c4b7b9944110110ed9a0ab327: Status 404 returned error can't find the container with id 2ab9f5b8ad3be3764284c4495dff65a804e7551c4b7b9944110110ed9a0ab327 Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.885003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99" (OuterVolumeSpecName: "kube-api-access-66f99") pod "6da6d037-70d9-48e5-9dd6-8c205f784235" (UID: "6da6d037-70d9-48e5-9dd6-8c205f784235"). InnerVolumeSpecName "kube-api-access-66f99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.909728 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66f99\" (UniqueName: \"kubernetes.io/projected/6da6d037-70d9-48e5-9dd6-8c205f784235-kube-api-access-66f99\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.909758 5033 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:16 crc kubenswrapper[5033]: I0319 19:15:16.909768 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6da6d037-70d9-48e5-9dd6-8c205f784235-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.310812 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8z6ts-config-gdvsz" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.310985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8z6ts-config-gdvsz" event={"ID":"6da6d037-70d9-48e5-9dd6-8c205f784235","Type":"ContainerDied","Data":"37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.312387 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b7beb19d3839881a1b6663c936e06e72d9c0fadf56483387451a595bd3313a" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.332704 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"664f78a2c0cd9647c5de2539b94698cde27d0693ce04bf4e0e01c8bbc1924538"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.332740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"8c81c007b0ae3402ea2a54955a84e0ee401a064bf6c402b2edbf51a109e33728"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.332751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"b1565dad9fc32dbf9ec692b3f6d01ea841881610449affadae6bee6b5a195389"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.336596 5033 generic.go:334] "Generic (PLEG): container finished" podID="4e46793f-868f-42f3-85f4-007b0047583b" containerID="603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf" exitCode=0 Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.336772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" 
event={"ID":"4e46793f-868f-42f3-85f4-007b0047583b","Type":"ContainerDied","Data":"603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.336799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" event={"ID":"4e46793f-868f-42f3-85f4-007b0047583b","Type":"ContainerStarted","Data":"2ab9f5b8ad3be3764284c4495dff65a804e7551c4b7b9944110110ed9a0ab327"} Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.737879 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8z6ts-config-gdvsz"] Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.751660 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8z6ts-config-gdvsz"] Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.810390 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.937814 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle\") pod \"b4633785-c4bb-4b69-9383-e479734c029f\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.937930 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data\") pod \"b4633785-c4bb-4b69-9383-e479734c029f\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.937973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rggz2\" (UniqueName: \"kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2\") pod 
\"b4633785-c4bb-4b69-9383-e479734c029f\" (UID: \"b4633785-c4bb-4b69-9383-e479734c029f\") " Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.941330 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2" (OuterVolumeSpecName: "kube-api-access-rggz2") pod "b4633785-c4bb-4b69-9383-e479734c029f" (UID: "b4633785-c4bb-4b69-9383-e479734c029f"). InnerVolumeSpecName "kube-api-access-rggz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.960163 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4633785-c4bb-4b69-9383-e479734c029f" (UID: "b4633785-c4bb-4b69-9383-e479734c029f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:17 crc kubenswrapper[5033]: I0319 19:15:17.982695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data" (OuterVolumeSpecName: "config-data") pod "b4633785-c4bb-4b69-9383-e479734c029f" (UID: "b4633785-c4bb-4b69-9383-e479734c029f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.040964 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rggz2\" (UniqueName: \"kubernetes.io/projected/b4633785-c4bb-4b69-9383-e479734c029f-kube-api-access-rggz2\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.041000 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.041009 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4633785-c4bb-4b69-9383-e479734c029f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.348633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerStarted","Data":"4ee9fc3781947cd41669ee403788b21b30f95846244159fb931c2c87c921ecdd"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.348685 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1d0456b5-83d8-48b0-84a0-2d4d60604b9b","Type":"ContainerStarted","Data":"972a154c4f6f0492badc7934e34a928b5a38631a6abc171cdea101922ad8f480"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.350709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kpzv2" event={"ID":"b4633785-c4bb-4b69-9383-e479734c029f","Type":"ContainerDied","Data":"8ff8f041569408690ef5a6df3cdf239eef1f6c81db71f112c984990f60c326bb"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.350744 5033 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8ff8f041569408690ef5a6df3cdf239eef1f6c81db71f112c984990f60c326bb" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.350744 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kpzv2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.358546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"f0c6776c8672c01c8ac63bd828fd6561e3b3c7ba4e2b9485c77d541105c32124"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.358594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"37ced05ed53ee6165fb1a898198f0fc17d61babe3a32b1ef744d9777c06d6468"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.358606 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"0ff756452e8758203713cad223c0cb45d109367e7b621aad8793773830438143"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.358616 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a91fda80-4324-4015-a32f-3396d6d2da1d","Type":"ContainerStarted","Data":"1cd7e05ba21e26885a65ea0f5cf6a5f9f6d603152fdaf1088bf94db7eb2b1df0"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.360785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" event={"ID":"4e46793f-868f-42f3-85f4-007b0047583b","Type":"ContainerStarted","Data":"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8"} Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.360916 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:18 crc 
kubenswrapper[5033]: I0319 19:15:18.380180 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.380162922 podStartE2EDuration="13.380162922s" podCreationTimestamp="2026-03-19 19:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:18.378180606 +0000 UTC m=+1128.483210465" watchObservedRunningTime="2026-03-19 19:15:18.380162922 +0000 UTC m=+1128.485192771" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.420603 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.533653744 podStartE2EDuration="47.420582858s" podCreationTimestamp="2026-03-19 19:14:31 +0000 UTC" firstStartedPulling="2026-03-19 19:15:06.29417843 +0000 UTC m=+1116.399208279" lastFinishedPulling="2026-03-19 19:15:16.181107544 +0000 UTC m=+1126.286137393" observedRunningTime="2026-03-19 19:15:18.420367832 +0000 UTC m=+1128.525397681" watchObservedRunningTime="2026-03-19 19:15:18.420582858 +0000 UTC m=+1128.525612707" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.450964 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" podStartSLOduration=3.450946911 podStartE2EDuration="3.450946911s" podCreationTimestamp="2026-03-19 19:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:18.449944853 +0000 UTC m=+1128.554974702" watchObservedRunningTime="2026-03-19 19:15:18.450946911 +0000 UTC m=+1128.555976760" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.552431 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f6zh2"] Mar 19 19:15:18 crc kubenswrapper[5033]: E0319 19:15:18.552785 5033 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6da6d037-70d9-48e5-9dd6-8c205f784235" containerName="ovn-config" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.552800 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da6d037-70d9-48e5-9dd6-8c205f784235" containerName="ovn-config" Mar 19 19:15:18 crc kubenswrapper[5033]: E0319 19:15:18.552815 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4633785-c4bb-4b69-9383-e479734c029f" containerName="keystone-db-sync" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.552821 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4633785-c4bb-4b69-9383-e479734c029f" containerName="keystone-db-sync" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.552990 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da6d037-70d9-48e5-9dd6-8c205f784235" containerName="ovn-config" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.553002 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4633785-c4bb-4b69-9383-e479734c029f" containerName="keystone-db-sync" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.553584 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.558035 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.558255 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.558380 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.558550 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tm5hs" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.558672 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.614524 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.662676 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.663084 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6vxx\" (UniqueName: \"kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.663198 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.663329 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.665468 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.665597 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.664745 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da6d037-70d9-48e5-9dd6-8c205f784235" path="/var/lib/kubelet/pods/6da6d037-70d9-48e5-9dd6-8c205f784235/volumes" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.666590 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zh2"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.708224 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"] 
Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.709863 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.755478 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771636 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771693 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771796 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6vxx\" (UniqueName: \"kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771821 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.771842 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.776565 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.779717 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.780364 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.796582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys\") pod 
\"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.799092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.814773 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2hbc5"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.816582 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.818185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6vxx\" (UniqueName: \"kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx\") pod \"keystone-bootstrap-f6zh2\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") " pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.822243 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-44klq" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.822539 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.823296 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.833524 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2hbc5"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.860744 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] 
Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.863170 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.878118 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.878387 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.879022 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.879150 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.879249 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzc87\" (UniqueName: \"kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.879352 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.879483 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.887656 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zh2" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.888403 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.939960 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-jjb4h"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.941366 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.952234 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.952860 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.953026 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.953150 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tf27p" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.973768 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-jjb4h"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983462 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzc87\" (UniqueName: \"kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts\") pod 
\"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983541 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983603 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983623 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " 
pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.983677 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk8tx\" (UniqueName: \"kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.989969 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.989983 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.994722 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dz8np"] Mar 19 19:15:18 crc kubenswrapper[5033]: I0319 19:15:18.995985 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.006524 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xzrm7" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.006696 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014594 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014828 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg8bj\" (UniqueName: \"kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014881 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014900 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.014952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.015800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.016528 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.057722 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzc87\" (UniqueName: \"kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87\") pod \"dnsmasq-dns-784f69c749-nldzw\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.058439 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.099529 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dz8np"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115687 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs\") pod \"cloudkitty-db-sync-jjb4h\" (UID: 
\"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlms6\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115801 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115851 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg8bj\" (UniqueName: \"kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115882 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle\") pod \"cloudkitty-db-sync-jjb4h\" (UID: 
\"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115907 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115940 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115972 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.115991 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116013 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116045 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116093 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116108 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk8tx\" (UniqueName: \"kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116126 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id\") pod 
\"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116141 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khkr4\" (UniqueName: \"kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.116155 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.117648 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.119380 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.120000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:15:19 crc kubenswrapper[5033]: 
I0319 19:15:19.122221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.128076 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.136952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.143265 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.156180 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.161334 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg8bj\" (UniqueName: \"kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.161376 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.161503 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.163669 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vq5tm"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.163817 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk8tx\" (UniqueName: \"kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx\") pod \"ceilometer-0\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.165487 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.167373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data\") pod \"cinder-db-sync-2hbc5\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.172546 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s77d9"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.172826 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.173149 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.193166 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hbc5"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.195236 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vq5tm"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.203648 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.218367 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221127 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221200 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221284 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khkr4\" (UniqueName: \"kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221354 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221377 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlms6\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221554 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznnp\" (UniqueName: \"kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.221659 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.229090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.233846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.242934 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.248983 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.250120 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khkr4\" (UniqueName: \"kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4\") pod \"barbican-db-sync-dz8np\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.252056 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.252132 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.265068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlms6\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6\") pod \"cloudkitty-db-sync-jjb4h\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.270955 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9vptm"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.285936 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.290147 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.290583 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9vptm"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.303153 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6r6gh"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.304467 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.310527 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.310740 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.310963 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tgllq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.311809 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6r6gh"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.320153 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9vptm"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.327433 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.327566 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.327621 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.327704 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppzh\" (UniqueName: \"kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.327743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.332254 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.333941 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznnp\" (UniqueName: \"kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.334003 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.334061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: E0319 19:15:19.335547 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-4vqxv ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-847c4cc679-9vptm" podUID="4f82d780-c47f-4215-b61c-ac4a47d8304a"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.341141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.341352 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.343040 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.345147 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"]
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.368115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznnp\" (UniqueName: \"kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp\") pod \"neutron-db-sync-vq5tm\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.395336 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-jjb4h"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.401488 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dz8np"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.419635 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.433306 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.438194 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.438861 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.438920 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.438943 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439048 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439077 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqxv\" (UniqueName: \"kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439181 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439271 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439307 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppzh\" (UniqueName: \"kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.439345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.441507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.444871 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.444991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.446212 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.457737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppzh\" (UniqueName: \"kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh\") pod \"placement-db-sync-6r6gh\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " pod="openstack/placement-db-sync-6r6gh"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.509157 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vq5tm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545531 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545585 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545630 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44mj\" (UniqueName: \"kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545658 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545709 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqxv\" (UniqueName: \"kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545797 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545870 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.545887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.547743 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.548753 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.549049 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.549935 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.550985 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.600271 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqxv\" (UniqueName: \"kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv\") pod \"dnsmasq-dns-847c4cc679-9vptm\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " pod="openstack/dnsmasq-dns-847c4cc679-9vptm"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.626047 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zh2"]
Mar 19 19:15:19 crc kubenswrapper[5033]: W0319 19:15:19.638788 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae1b07c_506f_41c0_958a_927fa931c8ae.slice/crio-07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47 WatchSource:0}: Error finding container 07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47: Status 404 returned error can't find the container with id 07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.648817 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") "
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.648886 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") "
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.650459 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.650503 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.650792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") "
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.650843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") "
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.650872 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") "
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651077 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651220 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44mj\" (UniqueName: \"kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651266 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651388 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651416 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 
19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651434 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651464 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config" (OuterVolumeSpecName: "config") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651532 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651608 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651710 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.651729 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.652086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.652352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.652799 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.653106 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.654992 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.676080 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6r6gh" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.678610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44mj\" (UniqueName: \"kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj\") pod \"dnsmasq-dns-785d8bcb8c-bbrgq\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.684924 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.686824 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.691757 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.694319 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.694539 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xnfth" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.694481 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.700947 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.701977 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.756860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vqxv\" (UniqueName: \"kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv\") pod \"4f82d780-c47f-4215-b61c-ac4a47d8304a\" (UID: \"4f82d780-c47f-4215-b61c-ac4a47d8304a\") " Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.757954 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.757971 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f82d780-c47f-4215-b61c-ac4a47d8304a-config\") on node \"crc\" DevicePath \"\"" 
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.763299 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv" (OuterVolumeSpecName: "kube-api-access-4vqxv") pod "4f82d780-c47f-4215-b61c-ac4a47d8304a" (UID: "4f82d780-c47f-4215-b61c-ac4a47d8304a"). InnerVolumeSpecName "kube-api-access-4vqxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.826218 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.847044 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.848886 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: W0319 19:15:19.849605 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a0cdec_49e6_4dce_81e0_f27260ee2f7f.slice/crio-e4fea98523efaf18f1908504c1666974933761fd0a52bcb135e61afbda1d58f6 WatchSource:0}: Error finding container e4fea98523efaf18f1908504c1666974933761fd0a52bcb135e61afbda1d58f6: Status 404 returned error can't find the container with id e4fea98523efaf18f1908504c1666974933761fd0a52bcb135e61afbda1d58f6 Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.857981 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.858051 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859120 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859173 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859202 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859222 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g55s\" (UniqueName: \"kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859310 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 
19:15:19.859342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859383 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.859469 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vqxv\" (UniqueName: \"kubernetes.io/projected/4f82d780-c47f-4215-b61c-ac4a47d8304a-kube-api-access-4vqxv\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.874659 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.895492 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967492 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967545 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967615 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967664 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967704 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967763 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g55s\" (UniqueName: \"kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967865 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967888 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967910 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.967992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.968013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.968033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.968059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.968124 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.973402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.974009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.984216 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:15:19 crc kubenswrapper[5033]: I0319 19:15:19.984277 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ced0ba0235b23bf9966e6eaf15351f0da0b1c8b6be463535cbc385832171ea67/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.006280 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.006811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.007121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.007827 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.008684 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g55s\" (UniqueName: \"kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.063012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069354 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs\") pod \"glance-default-external-api-0\" (UID: 
\"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069500 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069525 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069644 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg\") pod \"glance-default-external-api-0\" (UID: 
\"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.069974 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.070026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.075966 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.076002 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e28e119150301283e822b5e6f5ae17495e606ad12e9d94c8819dbad244f5550/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.082345 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.087466 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.089117 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.093791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.101322 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.108252 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.207979 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.208839 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.257007 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2hbc5"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.445645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hbc5" event={"ID":"715f11c5-fa05-4d4a-9d52-a21479ced465","Type":"ContainerStarted","Data":"64263f8aa7a503233e7368535ddba622457471801e480c084a4429bbc98b8a89"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.446922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerStarted","Data":"c7d90ab04bab566f5ba74b8495705aafa7542f99d6248c90cd252a4dbb3b4358"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.448070 5033 generic.go:334] "Generic (PLEG): container finished" podID="91a0cdec-49e6-4dce-81e0-f27260ee2f7f" containerID="f099b519741a8ca894659c6d8e8ac847b62011fa6740c63d963e2f466ca1aec8" exitCode=0 Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.448120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-nldzw" event={"ID":"91a0cdec-49e6-4dce-81e0-f27260ee2f7f","Type":"ContainerDied","Data":"f099b519741a8ca894659c6d8e8ac847b62011fa6740c63d963e2f466ca1aec8"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.448135 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-nldzw" event={"ID":"91a0cdec-49e6-4dce-81e0-f27260ee2f7f","Type":"ContainerStarted","Data":"e4fea98523efaf18f1908504c1666974933761fd0a52bcb135e61afbda1d58f6"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.454085 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.498681 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-9vptm" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.499236 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="dnsmasq-dns" containerID="cri-o://2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8" gracePeriod=10 Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.500922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zh2" event={"ID":"cae1b07c-506f-41c0-958a-927fa931c8ae","Type":"ContainerStarted","Data":"0016d8f6cd890d8cd54b29cc4d36180c232483682614ac7645d93dfb9decc896"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.500955 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zh2" event={"ID":"cae1b07c-506f-41c0-958a-927fa931c8ae","Type":"ContainerStarted","Data":"07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47"} Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.547623 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dz8np"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.569132 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-jjb4h"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.585510 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f6zh2" podStartSLOduration=2.578444867 podStartE2EDuration="2.578444867s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:20.561763798 +0000 UTC m=+1130.666793647" watchObservedRunningTime="2026-03-19 19:15:20.578444867 +0000 UTC m=+1130.683474716" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.710136 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.710176 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.710331 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.748999 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.776289 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9vptm"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.795097 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-9vptm"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.881474 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vq5tm"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.900566 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"] Mar 19 19:15:20 crc kubenswrapper[5033]: I0319 19:15:20.915506 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6r6gh"] Mar 19 19:15:20 crc kubenswrapper[5033]: W0319 19:15:20.935640 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39e9fb01_0fe4_4bf2_9387_5919f16fb3ea.slice/crio-b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b WatchSource:0}: Error finding container b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b: Status 404 returned error can't find the container with id 
b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.219249 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.411127 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.516461 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.571976 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-nldzw" event={"ID":"91a0cdec-49e6-4dce-81e0-f27260ee2f7f","Type":"ContainerDied","Data":"e4fea98523efaf18f1908504c1666974933761fd0a52bcb135e61afbda1d58f6"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.572045 5033 scope.go:117] "RemoveContainer" containerID="f099b519741a8ca894659c6d8e8ac847b62011fa6740c63d963e2f466ca1aec8" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.572113 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-nldzw" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb\") pod \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584482 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc\") pod \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584505 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb\") pod \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzc87\" (UniqueName: \"kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87\") pod \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vq5tm" event={"ID":"97cf8f72-4051-495d-970c-388cbd48a0bb","Type":"ContainerStarted","Data":"5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vq5tm" 
event={"ID":"97cf8f72-4051-495d-970c-388cbd48a0bb","Type":"ContainerStarted","Data":"959f1fbbe4079a51b840486b694688ad6f6313f30661cafb34bddf3deb893dc3"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.584900 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config\") pod \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\" (UID: \"91a0cdec-49e6-4dce-81e0-f27260ee2f7f\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.602595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87" (OuterVolumeSpecName: "kube-api-access-mzc87") pod "91a0cdec-49e6-4dce-81e0-f27260ee2f7f" (UID: "91a0cdec-49e6-4dce-81e0-f27260ee2f7f"). InnerVolumeSpecName "kube-api-access-mzc87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.607968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerStarted","Data":"da5ba0688f6f3193ad8d2d4f60a6ad5bec124793090f52f637ddead5b6954283"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.629971 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vq5tm" podStartSLOduration=3.629950341 podStartE2EDuration="3.629950341s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:21.608330233 +0000 UTC m=+1131.713360082" watchObservedRunningTime="2026-03-19 19:15:21.629950341 +0000 UTC m=+1131.734980190" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.666024 5033 generic.go:334] "Generic (PLEG): container finished" podID="4e46793f-868f-42f3-85f4-007b0047583b" 
containerID="2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8" exitCode=0 Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.666231 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" event={"ID":"4e46793f-868f-42f3-85f4-007b0047583b","Type":"ContainerDied","Data":"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.666370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" event={"ID":"4e46793f-868f-42f3-85f4-007b0047583b","Type":"ContainerDied","Data":"2ab9f5b8ad3be3764284c4495dff65a804e7551c4b7b9944110110ed9a0ab327"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.666392 5033 scope.go:117] "RemoveContainer" containerID="2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.666691 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-sbmvv" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.684031 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "91a0cdec-49e6-4dce-81e0-f27260ee2f7f" (UID: "91a0cdec-49e6-4dce-81e0-f27260ee2f7f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.685933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config" (OuterVolumeSpecName: "config") pod "91a0cdec-49e6-4dce-81e0-f27260ee2f7f" (UID: "91a0cdec-49e6-4dce-81e0-f27260ee2f7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.686140 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.686249 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.686347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.686406 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.686597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx7cx\" (UniqueName: \"kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.692182 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.692219 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.692231 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzc87\" (UniqueName: \"kubernetes.io/projected/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-kube-api-access-mzc87\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.705686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dz8np" event={"ID":"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9","Type":"ContainerStarted","Data":"e5583c80aa841c6f8f51fc9d6cf7ec8178c0a0afcb2ff8755f8a07d26afd9eb7"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.709546 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91a0cdec-49e6-4dce-81e0-f27260ee2f7f" (UID: "91a0cdec-49e6-4dce-81e0-f27260ee2f7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.709737 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91a0cdec-49e6-4dce-81e0-f27260ee2f7f" (UID: "91a0cdec-49e6-4dce-81e0-f27260ee2f7f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.718758 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.746017 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx" (OuterVolumeSpecName: "kube-api-access-lx7cx") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "kube-api-access-lx7cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.749773 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e2f8699-552e-41fa-9109-033ab974d401" containerID="1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9" exitCode=0 Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.750136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" event={"ID":"8e2f8699-552e-41fa-9109-033ab974d401","Type":"ContainerDied","Data":"1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.750179 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" event={"ID":"8e2f8699-552e-41fa-9109-033ab974d401","Type":"ContainerStarted","Data":"eba4d60e830e137277d107dfd290678587f630d6c400bee90286f068985f98bf"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.764816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6r6gh" event={"ID":"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea","Type":"ContainerStarted","Data":"b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.793115 5033 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.804343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-jjb4h" event={"ID":"412949e0-343a-45ad-873c-ff40cecb82de","Type":"ContainerStarted","Data":"924b18f7e4d021a4a838120a69f156ac17723131c128d17703277fbd8485881c"} Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.808981 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.809518 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.810167 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a0cdec-49e6-4dce-81e0-f27260ee2f7f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.810303 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx7cx\" (UniqueName: \"kubernetes.io/projected/4e46793f-868f-42f3-85f4-007b0047583b-kube-api-access-lx7cx\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.821791 5033 scope.go:117] "RemoveContainer" containerID="603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf" Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.855813 5033 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.965110 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:21 crc kubenswrapper[5033]: I0319 19:15:21.990022 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.011172 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.013031 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.013240 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") pod \"4e46793f-868f-42f3-85f4-007b0047583b\" (UID: \"4e46793f-868f-42f3-85f4-007b0047583b\") " Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.013766 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:22 crc kubenswrapper[5033]: W0319 19:15:22.013831 5033 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4e46793f-868f-42f3-85f4-007b0047583b/volumes/kubernetes.io~configmap/dns-svc Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.013844 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.015595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config" (OuterVolumeSpecName: "config") pod "4e46793f-868f-42f3-85f4-007b0047583b" (UID: "4e46793f-868f-42f3-85f4-007b0047583b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.120839 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.120877 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e46793f-868f-42f3-85f4-007b0047583b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.175760 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"] Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.230031 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-nldzw"] Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.285425 5033 scope.go:117] "RemoveContainer" containerID="2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8" Mar 19 19:15:22 crc kubenswrapper[5033]: E0319 19:15:22.286196 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8\": container with ID starting with 2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8 not found: ID does not exist" containerID="2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.286225 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8"} err="failed to get container status \"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8\": rpc error: code = NotFound desc = could not find container \"2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8\": 
container with ID starting with 2107f56b0d88f9c2af109eded347cf38bbf3011ae4212e3b2e7472ef19b40ff8 not found: ID does not exist" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.286246 5033 scope.go:117] "RemoveContainer" containerID="603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf" Mar 19 19:15:22 crc kubenswrapper[5033]: E0319 19:15:22.294392 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf\": container with ID starting with 603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf not found: ID does not exist" containerID="603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.294426 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf"} err="failed to get container status \"603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf\": rpc error: code = NotFound desc = could not find container \"603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf\": container with ID starting with 603ba60674978372f6c39cb0f0dd994fd81292fea7547980ce9700edbc157aaf not found: ID does not exist" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.319625 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.334119 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-sbmvv"] Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.670077 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e46793f-868f-42f3-85f4-007b0047583b" path="/var/lib/kubelet/pods/4e46793f-868f-42f3-85f4-007b0047583b/volumes" Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 
19:15:22.670905 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f82d780-c47f-4215-b61c-ac4a47d8304a" path="/var/lib/kubelet/pods/4f82d780-c47f-4215-b61c-ac4a47d8304a/volumes"
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.671319 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a0cdec-49e6-4dce-81e0-f27260ee2f7f" path="/var/lib/kubelet/pods/91a0cdec-49e6-4dce-81e0-f27260ee2f7f/volumes"
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.847728 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerStarted","Data":"1157683dd6587c92fe5fa117964985e1340037062164ebdd751453ec33da608f"}
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.869578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" event={"ID":"8e2f8699-552e-41fa-9109-033ab974d401","Type":"ContainerStarted","Data":"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12"}
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.869709 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.897320 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" podStartSLOduration=3.897303672 podStartE2EDuration="3.897303672s" podCreationTimestamp="2026-03-19 19:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:22.891434747 +0000 UTC m=+1132.996464596" watchObservedRunningTime="2026-03-19 19:15:22.897303672 +0000 UTC m=+1133.002333521"
Mar 19 19:15:22 crc kubenswrapper[5033]: I0319 19:15:22.900805 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerStarted","Data":"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530"}
Mar 19 19:15:23 crc kubenswrapper[5033]: I0319 19:15:23.929219 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerStarted","Data":"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740"}
Mar 19 19:15:23 crc kubenswrapper[5033]: I0319 19:15:23.929313 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-log" containerID="cri-o://d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" gracePeriod=30
Mar 19 19:15:23 crc kubenswrapper[5033]: I0319 19:15:23.929688 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-httpd" containerID="cri-o://f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740" gracePeriod=30
Mar 19 19:15:23 crc kubenswrapper[5033]: I0319 19:15:23.935560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerStarted","Data":"edc2be6630724610cb0208d7c603c07a3f63308b95ed243a1163ca583037e205"}
Mar 19 19:15:23 crc kubenswrapper[5033]: I0319 19:15:23.951167 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.951136851 podStartE2EDuration="5.951136851s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:23.945610775 +0000 UTC m=+1134.050640624" watchObservedRunningTime="2026-03-19 19:15:23.951136851 +0000 UTC m=+1134.056166730"
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.690437 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794175 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g55s\" (UniqueName: \"kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794534 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794594 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794622 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794663 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794692 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794758 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.794902 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"30c964f1-506a-4fea-8c31-24c9b11a6a48\" (UID: \"30c964f1-506a-4fea-8c31-24c9b11a6a48\") "
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.795906 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.796065 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs" (OuterVolumeSpecName: "logs") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.806611 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts" (OuterVolumeSpecName: "scripts") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.818759 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s" (OuterVolumeSpecName: "kube-api-access-8g55s") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "kube-api-access-8g55s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.897211 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g55s\" (UniqueName: \"kubernetes.io/projected/30c964f1-506a-4fea-8c31-24c9b11a6a48-kube-api-access-8g55s\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.897249 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.897261 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.897272 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c964f1-506a-4fea-8c31-24c9b11a6a48-logs\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.949650 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.965319 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data" (OuterVolumeSpecName: "config-data") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.966262 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854" (OuterVolumeSpecName: "glance") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.981184 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30c964f1-506a-4fea-8c31-24c9b11a6a48" (UID: "30c964f1-506a-4fea-8c31-24c9b11a6a48"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.981253 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerStarted","Data":"ec310b9b8aefe18e9df390bec5cd1d135baf3e1cade3bdafcf013cb94391adc0"}
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.981426 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-log" containerID="cri-o://edc2be6630724610cb0208d7c603c07a3f63308b95ed243a1163ca583037e205" gracePeriod=30
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.982001 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-httpd" containerID="cri-o://ec310b9b8aefe18e9df390bec5cd1d135baf3e1cade3bdafcf013cb94391adc0" gracePeriod=30
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.986368 5033 generic.go:334] "Generic (PLEG): container finished" podID="cae1b07c-506f-41c0-958a-927fa931c8ae" containerID="0016d8f6cd890d8cd54b29cc4d36180c232483682614ac7645d93dfb9decc896" exitCode=0
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.986469 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zh2" event={"ID":"cae1b07c-506f-41c0-958a-927fa931c8ae","Type":"ContainerDied","Data":"0016d8f6cd890d8cd54b29cc4d36180c232483682614ac7645d93dfb9decc896"}
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989409 5033 generic.go:334] "Generic (PLEG): container finished" podID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerID="f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740" exitCode=143
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989431 5033 generic.go:334] "Generic (PLEG): container finished" podID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerID="d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" exitCode=143
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerDied","Data":"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740"}
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerDied","Data":"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530"}
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989501 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30c964f1-506a-4fea-8c31-24c9b11a6a48","Type":"ContainerDied","Data":"da5ba0688f6f3193ad8d2d4f60a6ad5bec124793090f52f637ddead5b6954283"}
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989519 5033 scope.go:117] "RemoveContainer" containerID="f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740"
Mar 19 19:15:24 crc kubenswrapper[5033]: I0319 19:15:24.989666 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.002045 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") on node \"crc\" "
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.002076 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.002092 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.002101 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c964f1-506a-4fea-8c31-24c9b11a6a48-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.020175 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.020157317 podStartE2EDuration="7.020157317s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:25.007925023 +0000 UTC m=+1135.112954872" watchObservedRunningTime="2026-03-19 19:15:25.020157317 +0000 UTC m=+1135.125187166"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.076576 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.077299 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854") on node "crc"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.105723 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.120173 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") on node \"crc\" DevicePath \"\""
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.136626 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.145897 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 19:15:25 crc kubenswrapper[5033]: E0319 19:15:25.146326 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-httpd"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146344 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-httpd"
Mar 19 19:15:25 crc kubenswrapper[5033]: E0319 19:15:25.146357 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a0cdec-49e6-4dce-81e0-f27260ee2f7f" containerName="init"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146363 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a0cdec-49e6-4dce-81e0-f27260ee2f7f" containerName="init"
Mar 19 19:15:25 crc kubenswrapper[5033]: E0319 19:15:25.146371 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="init"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146377 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="init"
Mar 19 19:15:25 crc kubenswrapper[5033]: E0319 19:15:25.146396 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="dnsmasq-dns"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146402 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="dnsmasq-dns"
Mar 19 19:15:25 crc kubenswrapper[5033]: E0319 19:15:25.146410 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-log"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146470 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-log"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146687 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e46793f-868f-42f3-85f4-007b0047583b" containerName="dnsmasq-dns"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146708 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-httpd"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146722 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a0cdec-49e6-4dce-81e0-f27260ee2f7f" containerName="init"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.146732 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" containerName="glance-log"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.148658 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.152088 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.152228 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.156515 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221325 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221377 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vx9\" (UniqueName: \"kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221905 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.221989 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323695 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vx9\" (UniqueName: \"kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323783 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323891 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.323959 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.324294 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.324629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.327413 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.327487 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ced0ba0235b23bf9966e6eaf15351f0da0b1c8b6be463535cbc385832171ea67/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.328105 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.329627 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.329908 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.339206 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.344406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vx9\" (UniqueName: \"kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.365932 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:25 crc kubenswrapper[5033]: I0319 19:15:25.468107 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 19:15:26 crc kubenswrapper[5033]: I0319 19:15:26.015037 5033 generic.go:334] "Generic (PLEG): container finished" podID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerID="ec310b9b8aefe18e9df390bec5cd1d135baf3e1cade3bdafcf013cb94391adc0" exitCode=0
Mar 19 19:15:26 crc kubenswrapper[5033]: I0319 19:15:26.015392 5033 generic.go:334] "Generic (PLEG): container finished" podID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerID="edc2be6630724610cb0208d7c603c07a3f63308b95ed243a1163ca583037e205" exitCode=143
Mar 19 19:15:26 crc kubenswrapper[5033]: I0319 19:15:26.015119 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerDied","Data":"ec310b9b8aefe18e9df390bec5cd1d135baf3e1cade3bdafcf013cb94391adc0"}
Mar 19 19:15:26 crc kubenswrapper[5033]: I0319 19:15:26.015441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerDied","Data":"edc2be6630724610cb0208d7c603c07a3f63308b95ed243a1163ca583037e205"}
Mar 19 19:15:26 crc kubenswrapper[5033]: I0319 19:15:26.634423 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c964f1-506a-4fea-8c31-24c9b11a6a48" path="/var/lib/kubelet/pods/30c964f1-506a-4fea-8c31-24c9b11a6a48/volumes"
Mar 19 19:15:29 crc kubenswrapper[5033]: I0319 19:15:29.702617 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq"
Mar 19 19:15:29 crc kubenswrapper[5033]: I0319 19:15:29.766401 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"]
Mar 19 19:15:29 crc kubenswrapper[5033]: I0319 19:15:29.766948 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-8955p" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" containerID="cri-o://b8e80c1767e4282004a62d9483c41ed28cea7c1b161544b9ddd037cff26d538c" gracePeriod=10
Mar 19 19:15:30 crc kubenswrapper[5033]: I0319 19:15:30.083351 5033 generic.go:334] "Generic (PLEG): container finished" podID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerID="b8e80c1767e4282004a62d9483c41ed28cea7c1b161544b9ddd037cff26d538c" exitCode=0
Mar 19 19:15:30 crc kubenswrapper[5033]: I0319 19:15:30.083409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8955p" event={"ID":"e4998496-39f0-42d7-b0fa-e2caabe7ccad","Type":"ContainerDied","Data":"b8e80c1767e4282004a62d9483c41ed28cea7c1b161544b9ddd037cff26d538c"}
Mar 19 19:15:31 crc kubenswrapper[5033]: I0319 19:15:31.611608 5033 scope.go:117] "RemoveContainer" containerID="1edfa7a204497bf4920bb9aec8872ccdac51c6edf2d1bd3167d3f9b69c18edee"
Mar 19 19:15:31 crc kubenswrapper[5033]: I0319 19:15:31.842937 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-8955p" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused"
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.035844 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zh2"
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.110196 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zh2" event={"ID":"cae1b07c-506f-41c0-958a-927fa931c8ae","Type":"ContainerDied","Data":"07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47"}
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.110256 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07aea748f47bbd30b70415a2e84d8ab83feec0f7a6068db711a2aa67cd9a1c47"
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.110283 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zh2"
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.119868 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.119970 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6vxx\" (UniqueName: \"kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.120026 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.120111 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.120183 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.120264 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data\") pod \"cae1b07c-506f-41c0-958a-927fa931c8ae\" (UID: \"cae1b07c-506f-41c0-958a-927fa931c8ae\") "
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.128620 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx" (OuterVolumeSpecName: "kube-api-access-c6vxx") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "kube-api-access-c6vxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.135566 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts" (OuterVolumeSpecName: "scripts") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.135729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.149987 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.161697 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data" (OuterVolumeSpecName: "config-data") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.162550 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae1b07c-506f-41c0-958a-927fa931c8ae" (UID: "cae1b07c-506f-41c0-958a-927fa931c8ae"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224100 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224153 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6vxx\" (UniqueName: \"kubernetes.io/projected/cae1b07c-506f-41c0-958a-927fa931c8ae-kube-api-access-c6vxx\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224168 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224179 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224190 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:33 crc kubenswrapper[5033]: I0319 19:15:33.224200 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1b07c-506f-41c0-958a-927fa931c8ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.173599 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f6zh2"] Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.183559 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f6zh2"] Mar 19 19:15:34 crc 
kubenswrapper[5033]: I0319 19:15:34.294535 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t5tfl"] Mar 19 19:15:34 crc kubenswrapper[5033]: E0319 19:15:34.295335 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae1b07c-506f-41c0-958a-927fa931c8ae" containerName="keystone-bootstrap" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.295381 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae1b07c-506f-41c0-958a-927fa931c8ae" containerName="keystone-bootstrap" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.296237 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae1b07c-506f-41c0-958a-927fa931c8ae" containerName="keystone-bootstrap" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.309699 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.309940 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t5tfl"] Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.313856 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.314236 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.314492 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.316056 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tm5hs" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.445786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.445874 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.446004 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jfl4\" (UniqueName: \"kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.446343 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.446468 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.446580 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548842 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548897 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts\") pod \"keystone-bootstrap-t5tfl\" (UID: 
\"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.548952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jfl4\" (UniqueName: \"kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.554436 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.554909 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.555743 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.558051 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 
19:15:34.558776 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.567878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jfl4\" (UniqueName: \"kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4\") pod \"keystone-bootstrap-t5tfl\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.625396 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:34 crc kubenswrapper[5033]: I0319 19:15:34.638889 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae1b07c-506f-41c0-958a-927fa931c8ae" path="/var/lib/kubelet/pods/cae1b07c-506f-41c0-958a-927fa931c8ae/volumes" Mar 19 19:15:36 crc kubenswrapper[5033]: E0319 19:15:36.866099 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cf8f72_4051_495d_970c_388cbd48a0bb.slice/crio-5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cf8f72_4051_495d_970c_388cbd48a0bb.slice/crio-conmon-5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:15:36 crc kubenswrapper[5033]: E0319 19:15:36.978233 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 19 19:15:36 crc kubenswrapper[5033]: E0319 19:15:36.978836 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khkr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dz8np_openstack(8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Mar 19 19:15:36 crc kubenswrapper[5033]: E0319 19:15:36.980665 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dz8np" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" Mar 19 19:15:37 crc kubenswrapper[5033]: I0319 19:15:37.147053 5033 generic.go:334] "Generic (PLEG): container finished" podID="97cf8f72-4051-495d-970c-388cbd48a0bb" containerID="5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4" exitCode=0 Mar 19 19:15:37 crc kubenswrapper[5033]: I0319 19:15:37.147148 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vq5tm" event={"ID":"97cf8f72-4051-495d-970c-388cbd48a0bb","Type":"ContainerDied","Data":"5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4"} Mar 19 19:15:37 crc kubenswrapper[5033]: E0319 19:15:37.149034 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-dz8np" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" Mar 19 19:15:41 crc kubenswrapper[5033]: I0319 19:15:41.838532 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-8955p" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.033953 5033 scope.go:117] "RemoveContainer" containerID="d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.167728 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.225572 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5ca3320-22dc-4aae-ab80-07bd935c2465","Type":"ContainerDied","Data":"1157683dd6587c92fe5fa117964985e1340037062164ebdd751453ec33da608f"} Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.225608 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.295654 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.295888 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.295931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.295961 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: 
\"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296048 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296081 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296107 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296129 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle\") pod \"d5ca3320-22dc-4aae-ab80-07bd935c2465\" (UID: \"d5ca3320-22dc-4aae-ab80-07bd935c2465\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs" (OuterVolumeSpecName: "logs") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.296572 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.300813 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg" (OuterVolumeSpecName: "kube-api-access-7szmg") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "kube-api-access-7szmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.301328 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts" (OuterVolumeSpecName: "scripts") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.327170 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.331201 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c" (OuterVolumeSpecName: "glance") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.345522 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data" (OuterVolumeSpecName: "config-data") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.347428 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d5ca3320-22dc-4aae-ab80-07bd935c2465" (UID: "d5ca3320-22dc-4aae-ab80-07bd935c2465"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397776 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397818 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca3320-22dc-4aae-ab80-07bd935c2465-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397829 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397842 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397856 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/d5ca3320-22dc-4aae-ab80-07bd935c2465-kube-api-access-7szmg\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397895 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") on node \"crc\" " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.397906 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 
19:15:44.397915 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca3320-22dc-4aae-ab80-07bd935c2465-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.425488 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.425626 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c") on node "crc" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.499949 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.518972 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.527637 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vq5tm" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.613368 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.642308 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.649607 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:44 crc kubenswrapper[5033]: E0319 19:15:44.649998 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-log" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650014 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-log" Mar 19 19:15:44 crc kubenswrapper[5033]: E0319 19:15:44.650025 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="init" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650031 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="init" Mar 19 19:15:44 crc kubenswrapper[5033]: E0319 19:15:44.650040 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650045 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" Mar 19 19:15:44 crc kubenswrapper[5033]: E0319 19:15:44.650057 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cf8f72-4051-495d-970c-388cbd48a0bb" containerName="neutron-db-sync" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650063 5033 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="97cf8f72-4051-495d-970c-388cbd48a0bb" containerName="neutron-db-sync" Mar 19 19:15:44 crc kubenswrapper[5033]: E0319 19:15:44.650079 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-httpd" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650086 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-httpd" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650253 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-httpd" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650269 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650507 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cf8f72-4051-495d-970c-388cbd48a0bb" containerName="neutron-db-sync" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.650531 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" containerName="glance-log" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.651544 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.654193 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.676314 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.700503 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711134 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config\") pod \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznnp\" (UniqueName: \"kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp\") pod \"97cf8f72-4051-495d-970c-388cbd48a0bb\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711301 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2b6f\" (UniqueName: \"kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f\") pod \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711329 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb\") pod \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\" (UID: 
\"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb\") pod \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711418 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc\") pod \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\" (UID: \"e4998496-39f0-42d7-b0fa-e2caabe7ccad\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711434 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config\") pod \"97cf8f72-4051-495d-970c-388cbd48a0bb\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.711519 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle\") pod \"97cf8f72-4051-495d-970c-388cbd48a0bb\" (UID: \"97cf8f72-4051-495d-970c-388cbd48a0bb\") " Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.718712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp" (OuterVolumeSpecName: "kube-api-access-zznnp") pod "97cf8f72-4051-495d-970c-388cbd48a0bb" (UID: "97cf8f72-4051-495d-970c-388cbd48a0bb"). InnerVolumeSpecName "kube-api-access-zznnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.722681 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f" (OuterVolumeSpecName: "kube-api-access-t2b6f") pod "e4998496-39f0-42d7-b0fa-e2caabe7ccad" (UID: "e4998496-39f0-42d7-b0fa-e2caabe7ccad"). InnerVolumeSpecName "kube-api-access-t2b6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.742136 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97cf8f72-4051-495d-970c-388cbd48a0bb" (UID: "97cf8f72-4051-495d-970c-388cbd48a0bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.746186 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config" (OuterVolumeSpecName: "config") pod "97cf8f72-4051-495d-970c-388cbd48a0bb" (UID: "97cf8f72-4051-495d-970c-388cbd48a0bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.760570 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config" (OuterVolumeSpecName: "config") pod "e4998496-39f0-42d7-b0fa-e2caabe7ccad" (UID: "e4998496-39f0-42d7-b0fa-e2caabe7ccad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.763100 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4998496-39f0-42d7-b0fa-e2caabe7ccad" (UID: "e4998496-39f0-42d7-b0fa-e2caabe7ccad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.764817 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4998496-39f0-42d7-b0fa-e2caabe7ccad" (UID: "e4998496-39f0-42d7-b0fa-e2caabe7ccad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.768735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4998496-39f0-42d7-b0fa-e2caabe7ccad" (UID: "e4998496-39f0-42d7-b0fa-e2caabe7ccad"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813322 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsxv\" (UniqueName: \"kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813395 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813500 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813528 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813548 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813742 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813809 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2b6f\" (UniqueName: \"kubernetes.io/projected/e4998496-39f0-42d7-b0fa-e2caabe7ccad-kube-api-access-t2b6f\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813820 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc 
kubenswrapper[5033]: I0319 19:15:44.813829 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813838 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813848 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813856 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cf8f72-4051-495d-970c-388cbd48a0bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813865 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4998496-39f0-42d7-b0fa-e2caabe7ccad-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.813873 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznnp\" (UniqueName: \"kubernetes.io/projected/97cf8f72-4051-495d-970c-388cbd48a0bb-kube-api-access-zznnp\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.915324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 
19:15:44.915444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.915515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.915982 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916043 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsxv\" (UniqueName: \"kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916109 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916330 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.916476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.919075 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.919109 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e28e119150301283e822b5e6f5ae17495e606ad12e9d94c8819dbad244f5550/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.919613 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.920477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.930115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.930549 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.932765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsxv\" (UniqueName: \"kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.957306 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " pod="openstack/glance-default-external-api-0" Mar 19 19:15:44 crc kubenswrapper[5033]: I0319 19:15:44.976370 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.237255 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vq5tm" event={"ID":"97cf8f72-4051-495d-970c-388cbd48a0bb","Type":"ContainerDied","Data":"959f1fbbe4079a51b840486b694688ad6f6313f30661cafb34bddf3deb893dc3"} Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.237306 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959f1fbbe4079a51b840486b694688ad6f6313f30661cafb34bddf3deb893dc3" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.237307 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vq5tm" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.239687 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8955p" event={"ID":"e4998496-39f0-42d7-b0fa-e2caabe7ccad","Type":"ContainerDied","Data":"fbac80e442d9fc7ab400c6a5b2a27b0b03a6e11c9fe0d4ed03e30ba6e8520db4"} Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.239794 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8955p" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.280962 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"] Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.288458 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8955p"] Mar 19 19:15:45 crc kubenswrapper[5033]: E0319 19:15:45.635975 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 19 19:15:45 crc kubenswrapper[5033]: E0319 19:15:45.636318 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg8bj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2hbc5_openstack(715f11c5-fa05-4d4a-9d52-a21479ced465): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:45 crc kubenswrapper[5033]: E0319 19:15:45.637576 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2hbc5" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.812695 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"] Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.814882 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.827059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"] Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.938831 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"] Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.939868 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.939917 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpcb\" (UniqueName: \"kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.940203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.940277 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " 
pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.940333 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.940376 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.940546 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.951005 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s77d9" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.951307 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.951498 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.951646 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 19:15:45 crc kubenswrapper[5033]: I0319 19:15:45.955281 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"] Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042220 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042239 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042256 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skwn2\" (UniqueName: \"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042304 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042334 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042358 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042377 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpcb\" (UniqueName: \"kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042407 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042489 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.042505 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.043834 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.043965 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.043955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.044358 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.044798 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: 
\"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.063045 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvpcb\" (UniqueName: \"kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb\") pod \"dnsmasq-dns-55f844cf75-jnr9g\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.144004 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.144075 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.144617 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skwn2\" (UniqueName: \"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.144698 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " 
pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.144800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.149232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.151577 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.152072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.152148 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.161256 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.162035 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skwn2\" (UniqueName: \"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2\") pod \"neutron-76f8fddc48-z2hb6\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.225204 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:15:46 crc kubenswrapper[5033]: E0319 19:15:46.253898 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2hbc5" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.271994 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.638683 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ca3320-22dc-4aae-ab80-07bd935c2465" path="/var/lib/kubelet/pods/d5ca3320-22dc-4aae-ab80-07bd935c2465/volumes" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.639576 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" path="/var/lib/kubelet/pods/e4998496-39f0-42d7-b0fa-e2caabe7ccad/volumes" Mar 19 19:15:46 crc kubenswrapper[5033]: I0319 19:15:46.839894 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-8955p" podUID="e4998496-39f0-42d7-b0fa-e2caabe7ccad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.785301 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.787035 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.789216 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.789369 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.806045 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888135 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888222 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcl9\" (UniqueName: \"kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888499 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888618 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.888710 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990766 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990840 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990888 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcl9\" (UniqueName: \"kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:47 crc kubenswrapper[5033]: I0319 19:15:47.990963 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs\") 
pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.006441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.006675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.007351 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.008272 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.009677 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcl9\" (UniqueName: \"kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " 
pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.015188 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.025923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config\") pod \"neutron-8646b64d4f-xtmzw\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.109237 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.505510 5033 scope.go:117] "RemoveContainer" containerID="f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740" Mar 19 19:15:48 crc kubenswrapper[5033]: E0319 19:15:48.505816 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740\": container with ID starting with f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740 not found: ID does not exist" containerID="f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.505845 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740"} err="failed to get container status \"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740\": rpc error: code = NotFound desc = could not find container 
\"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740\": container with ID starting with f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740 not found: ID does not exist" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.505865 5033 scope.go:117] "RemoveContainer" containerID="d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" Mar 19 19:15:48 crc kubenswrapper[5033]: E0319 19:15:48.506055 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530\": container with ID starting with d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530 not found: ID does not exist" containerID="d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506076 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530"} err="failed to get container status \"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530\": rpc error: code = NotFound desc = could not find container \"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530\": container with ID starting with d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530 not found: ID does not exist" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506089 5033 scope.go:117] "RemoveContainer" containerID="f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506254 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740"} err="failed to get container status \"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740\": rpc error: code = NotFound desc = could not find 
container \"f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740\": container with ID starting with f5ee3539474d4057a688a82ab8df0abcd9f6a822b810822766a4cf1b3a5f8740 not found: ID does not exist" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506277 5033 scope.go:117] "RemoveContainer" containerID="d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506443 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530"} err="failed to get container status \"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530\": rpc error: code = NotFound desc = could not find container \"d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530\": container with ID starting with d753813c754562575bf076ac6c42123799b9a689ce05ed270eec5ac6be22c530 not found: ID does not exist" Mar 19 19:15:48 crc kubenswrapper[5033]: I0319 19:15:48.506553 5033 scope.go:117] "RemoveContainer" containerID="ec310b9b8aefe18e9df390bec5cd1d135baf3e1cade3bdafcf013cb94391adc0" Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.102883 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t5tfl"] Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.167558 5033 scope.go:117] "RemoveContainer" containerID="edc2be6630724610cb0208d7c603c07a3f63308b95ed243a1163ca583037e205" Mar 19 19:15:51 crc kubenswrapper[5033]: W0319 19:15:51.183881 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b4c780_bc31_4138_9aa2_3d47604bc88f.slice/crio-21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3 WatchSource:0}: Error finding container 21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3: Status 404 returned error can't find the container with id 
21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3 Mar 19 19:15:51 crc kubenswrapper[5033]: E0319 19:15:51.191181 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 19 19:15:51 crc kubenswrapper[5033]: E0319 19:15:51.191218 5033 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 19 19:15:51 crc kubenswrapper[5033]: E0319 19:15:51.191352 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/v
ar/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlms6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-jjb4h_openstack(412949e0-343a-45ad-873c-ff40cecb82de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:51 crc kubenswrapper[5033]: E0319 19:15:51.192678 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-jjb4h" podUID="412949e0-343a-45ad-873c-ff40cecb82de" Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.322401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerStarted","Data":"6ef793be4722adc9897ac5d585d60c3ddb3ad94bd2d25d8c77f8dac09950a1de"} Mar 19 19:15:51 crc 
kubenswrapper[5033]: I0319 19:15:51.334741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t5tfl" event={"ID":"04b4c780-bc31-4138-9aa2-3d47604bc88f","Type":"ContainerStarted","Data":"21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3"} Mar 19 19:15:51 crc kubenswrapper[5033]: E0319 19:15:51.340485 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-jjb4h" podUID="412949e0-343a-45ad-873c-ff40cecb82de" Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.388671 5033 scope.go:117] "RemoveContainer" containerID="b8e80c1767e4282004a62d9483c41ed28cea7c1b161544b9ddd037cff26d538c" Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.466656 5033 scope.go:117] "RemoveContainer" containerID="8c988c140eb3a02b6ccf84af9eedffb61c75b3ffb51c419a64f861e34525755f" Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.664272 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"] Mar 19 19:15:51 crc kubenswrapper[5033]: W0319 19:15:51.794109 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d9c45af_6417_419e_8527_4102a5c6d96f.slice/crio-cde937cd94acf9ff051e43b057e1700af31780f690403843abc5c439e3fa57b0 WatchSource:0}: Error finding container cde937cd94acf9ff051e43b057e1700af31780f690403843abc5c439e3fa57b0: Status 404 returned error can't find the container with id cde937cd94acf9ff051e43b057e1700af31780f690403843abc5c439e3fa57b0 Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.801876 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"] Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.814140 5033 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:15:51 crc kubenswrapper[5033]: I0319 19:15:51.909583 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.372726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerStarted","Data":"43b6cf2576b94a1750263249bcfadb96a5fda6a6440e91e30e3fdeb466699612"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.373337 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.373353 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerStarted","Data":"a36446b46b023b1c81c31f698eef52d286ae4ecabe9d883ac720569f66d7cb6e"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.373385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerStarted","Data":"93c828d322c8a212d04c5536040b0a3973f99b4bd779522a0be8660e5f72c2c0"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.387967 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6r6gh" event={"ID":"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea","Type":"ContainerStarted","Data":"31f70b8f5a1a3014397d27ff5c6c0d76b418cb9d3bb3dc81f914aa5912fb671f"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.398197 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerStarted","Data":"23ed328b240314f21cb36dce1fe08d86bd057bbf640cd0d10843674a204d11b7"} Mar 19 19:15:52 crc 
kubenswrapper[5033]: I0319 19:15:52.400145 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76f8fddc48-z2hb6" podStartSLOduration=7.400127412 podStartE2EDuration="7.400127412s" podCreationTimestamp="2026-03-19 19:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:52.387353453 +0000 UTC m=+1162.492383302" watchObservedRunningTime="2026-03-19 19:15:52.400127412 +0000 UTC m=+1162.505157261" Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.400904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t5tfl" event={"ID":"04b4c780-bc31-4138-9aa2-3d47604bc88f","Type":"ContainerStarted","Data":"d9e60797cf5580a8928dd2f31e8f8810eda9729f856428040c13ef15c8e0986f"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.415232 5033 generic.go:334] "Generic (PLEG): container finished" podID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerID="c842f7230729f2973b92567048a0ae34324df8c125b9ad5ddb9ea8d445ffe2c8" exitCode=0 Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.415313 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" event={"ID":"9d9c45af-6417-419e-8527-4102a5c6d96f","Type":"ContainerDied","Data":"c842f7230729f2973b92567048a0ae34324df8c125b9ad5ddb9ea8d445ffe2c8"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.415336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" event={"ID":"9d9c45af-6417-419e-8527-4102a5c6d96f","Type":"ContainerStarted","Data":"cde937cd94acf9ff051e43b057e1700af31780f690403843abc5c439e3fa57b0"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.420795 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6r6gh" podStartSLOduration=8.762299668 podStartE2EDuration="33.420772002s" 
podCreationTimestamp="2026-03-19 19:15:19 +0000 UTC" firstStartedPulling="2026-03-19 19:15:20.951616366 +0000 UTC m=+1131.056646215" lastFinishedPulling="2026-03-19 19:15:45.6100887 +0000 UTC m=+1155.715118549" observedRunningTime="2026-03-19 19:15:52.410859363 +0000 UTC m=+1162.515889212" watchObservedRunningTime="2026-03-19 19:15:52.420772002 +0000 UTC m=+1162.525801851" Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.430495 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerStarted","Data":"d2e833fd9040ecc087ea93fe6c36b7b5dfc6eb1ad50f9828ab30764b39640540"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.438695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerStarted","Data":"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.438750 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerStarted","Data":"ddb836bd9eafd11a4d5e8226055039e26d02b6674f1a590343e2f1342f039c1d"} Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.447945 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t5tfl" podStartSLOduration=18.447926355 podStartE2EDuration="18.447926355s" podCreationTimestamp="2026-03-19 19:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:52.429653692 +0000 UTC m=+1162.534683531" watchObservedRunningTime="2026-03-19 19:15:52.447926355 +0000 UTC m=+1162.552956204" Mar 19 19:15:52 crc kubenswrapper[5033]: I0319 19:15:52.451077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerStarted","Data":"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.461262 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dz8np" event={"ID":"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9","Type":"ContainerStarted","Data":"b5eef0a406f9d3c72679a385009cdc60fdc8bd7a1b40d7e8a71f1de0d0c2eec0"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.464896 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" event={"ID":"9d9c45af-6417-419e-8527-4102a5c6d96f","Type":"ContainerStarted","Data":"7e2f6c4b671377bf676feaf2c7688d19ba6b830a31ce89cf660153f2fc29e873"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.464988 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.467010 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerStarted","Data":"63efe028fc693f0085f904b7ee103e19c28c6f8ae305f652d0a11d65dc2237db"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.467041 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerStarted","Data":"674648697e22b3221647833afbe0ad84f56178bab693cda8b10a513986c21349"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.469260 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerStarted","Data":"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 
19:15:53.469342 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.471708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerStarted","Data":"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006"} Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.480582 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dz8np" podStartSLOduration=4.009435827 podStartE2EDuration="35.480564989s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="2026-03-19 19:15:20.830889132 +0000 UTC m=+1130.935918981" lastFinishedPulling="2026-03-19 19:15:52.302018294 +0000 UTC m=+1162.407048143" observedRunningTime="2026-03-19 19:15:53.475876297 +0000 UTC m=+1163.580906146" watchObservedRunningTime="2026-03-19 19:15:53.480564989 +0000 UTC m=+1163.585594838" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.493915 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=28.493868453 podStartE2EDuration="28.493868453s" podCreationTimestamp="2026-03-19 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:53.49268732 +0000 UTC m=+1163.597717169" watchObservedRunningTime="2026-03-19 19:15:53.493868453 +0000 UTC m=+1163.598898302" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.507499 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" podStartSLOduration=8.507486076 podStartE2EDuration="8.507486076s" podCreationTimestamp="2026-03-19 19:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:53.506313833 +0000 UTC m=+1163.611343682" watchObservedRunningTime="2026-03-19 19:15:53.507486076 +0000 UTC m=+1163.612515925" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.532940 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.532919271 podStartE2EDuration="9.532919271s" podCreationTimestamp="2026-03-19 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:53.522792436 +0000 UTC m=+1163.627822295" watchObservedRunningTime="2026-03-19 19:15:53.532919271 +0000 UTC m=+1163.637949120" Mar 19 19:15:53 crc kubenswrapper[5033]: I0319 19:15:53.540636 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8646b64d4f-xtmzw" podStartSLOduration=6.540617037 podStartE2EDuration="6.540617037s" podCreationTimestamp="2026-03-19 19:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:53.540402451 +0000 UTC m=+1163.645432320" watchObservedRunningTime="2026-03-19 19:15:53.540617037 +0000 UTC m=+1163.645646886" Mar 19 19:15:54 crc kubenswrapper[5033]: I0319 19:15:54.483212 5033 generic.go:334] "Generic (PLEG): container finished" podID="39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" containerID="31f70b8f5a1a3014397d27ff5c6c0d76b418cb9d3bb3dc81f914aa5912fb671f" exitCode=0 Mar 19 19:15:54 crc kubenswrapper[5033]: I0319 19:15:54.483267 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6r6gh" event={"ID":"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea","Type":"ContainerDied","Data":"31f70b8f5a1a3014397d27ff5c6c0d76b418cb9d3bb3dc81f914aa5912fb671f"} Mar 19 19:15:54 crc kubenswrapper[5033]: I0319 19:15:54.486987 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerStarted","Data":"2a2dc8e913f5460efd475ada44bceb3723831ce6b4936c432387ab2432bafcb8"} Mar 19 19:15:54 crc kubenswrapper[5033]: I0319 19:15:54.977341 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:15:54 crc kubenswrapper[5033]: I0319 19:15:54.977644 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.021632 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.037750 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.469350 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.469584 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.469621 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.469631 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.512603 5033 generic.go:334] "Generic (PLEG): container finished" podID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" containerID="b5eef0a406f9d3c72679a385009cdc60fdc8bd7a1b40d7e8a71f1de0d0c2eec0" exitCode=0 Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 
19:15:55.512648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dz8np" event={"ID":"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9","Type":"ContainerDied","Data":"b5eef0a406f9d3c72679a385009cdc60fdc8bd7a1b40d7e8a71f1de0d0c2eec0"} Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.515433 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t5tfl" event={"ID":"04b4c780-bc31-4138-9aa2-3d47604bc88f","Type":"ContainerDied","Data":"d9e60797cf5580a8928dd2f31e8f8810eda9729f856428040c13ef15c8e0986f"} Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.516061 5033 generic.go:334] "Generic (PLEG): container finished" podID="04b4c780-bc31-4138-9aa2-3d47604bc88f" containerID="d9e60797cf5580a8928dd2f31e8f8810eda9729f856428040c13ef15c8e0986f" exitCode=0 Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.517563 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.517595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.525034 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:55 crc kubenswrapper[5033]: I0319 19:15:55.528754 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:57 crc kubenswrapper[5033]: I0319 19:15:57.911771 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:57 crc kubenswrapper[5033]: I0319 19:15:57.932975 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6r6gh" Mar 19 19:15:57 crc kubenswrapper[5033]: I0319 19:15:57.936668 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.031981 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs\") pod \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032067 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jfl4\" (UniqueName: \"kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032118 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle\") pod \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hppzh\" (UniqueName: \"kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh\") pod \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data\") pod \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\" (UID: 
\"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032234 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032268 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts\") pod \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032338 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khkr4\" (UniqueName: \"kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4\") pod \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\" (UID: \"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032672 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data\") pod \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032697 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle\") pod \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\" (UID: \"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.032773 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys\") pod \"04b4c780-bc31-4138-9aa2-3d47604bc88f\" (UID: \"04b4c780-bc31-4138-9aa2-3d47604bc88f\") " Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.035356 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs" (OuterVolumeSpecName: "logs") pod "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" (UID: "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.037577 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts" (OuterVolumeSpecName: "scripts") pod "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" (UID: "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.038661 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.038817 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.039767 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4" (OuterVolumeSpecName: "kube-api-access-8jfl4") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "kube-api-access-8jfl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.042265 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts" (OuterVolumeSpecName: "scripts") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.046200 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4" (OuterVolumeSpecName: "kube-api-access-khkr4") pod "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" (UID: "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9"). InnerVolumeSpecName "kube-api-access-khkr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.046803 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh" (OuterVolumeSpecName: "kube-api-access-hppzh") pod "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" (UID: "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea"). InnerVolumeSpecName "kube-api-access-hppzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.060576 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" (UID: "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.076791 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" (UID: "8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.081156 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.081994 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" (UID: "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.083313 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data" (OuterVolumeSpecName: "config-data") pod "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" (UID: "39e9fb01-0fe4-4bf2-9387-5919f16fb3ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.083745 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data" (OuterVolumeSpecName: "config-data") pod "04b4c780-bc31-4138-9aa2-3d47604bc88f" (UID: "04b4c780-bc31-4138-9aa2-3d47604bc88f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135048 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135094 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135107 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135118 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khkr4\" (UniqueName: \"kubernetes.io/projected/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-kube-api-access-khkr4\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135129 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135138 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135146 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135155 5033 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135163 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04b4c780-bc31-4138-9aa2-3d47604bc88f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135171 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135179 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jfl4\" (UniqueName: \"kubernetes.io/projected/04b4c780-bc31-4138-9aa2-3d47604bc88f-kube-api-access-8jfl4\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135187 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135195 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hppzh\" (UniqueName: \"kubernetes.io/projected/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea-kube-api-access-hppzh\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.135204 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.547804 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dz8np" 
event={"ID":"8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9","Type":"ContainerDied","Data":"e5583c80aa841c6f8f51fc9d6cf7ec8178c0a0afcb2ff8755f8a07d26afd9eb7"} Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.548486 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5583c80aa841c6f8f51fc9d6cf7ec8178c0a0afcb2ff8755f8a07d26afd9eb7" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.547885 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dz8np" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.549592 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6r6gh" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.549570 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6r6gh" event={"ID":"39e9fb01-0fe4-4bf2-9387-5919f16fb3ea","Type":"ContainerDied","Data":"b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b"} Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.549750 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7a9576c423f636f365b2bed960a4328eb4cea02e8d4246be3b4cffeb37e0d3b" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.552711 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerStarted","Data":"eba221aef4fbf7142c38b2695a1c83d19c9029bf5ff14fcabd28006046d08c5c"} Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.554418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t5tfl" event={"ID":"04b4c780-bc31-4138-9aa2-3d47604bc88f","Type":"ContainerDied","Data":"21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3"} Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.554472 5033 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="21730fc8b7c2f9614b1c7fc24944a8e4d9cc5ebf5c55ace636d38f400bdfaaa3" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.554581 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t5tfl" Mar 19 19:15:58 crc kubenswrapper[5033]: I0319 19:15:58.698514 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.165360 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"] Mar 19 19:15:59 crc kubenswrapper[5033]: E0319 19:15:59.166289 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" containerName="placement-db-sync" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166302 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" containerName="placement-db-sync" Mar 19 19:15:59 crc kubenswrapper[5033]: E0319 19:15:59.166333 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" containerName="barbican-db-sync" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166341 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" containerName="barbican-db-sync" Mar 19 19:15:59 crc kubenswrapper[5033]: E0319 19:15:59.166355 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b4c780-bc31-4138-9aa2-3d47604bc88f" containerName="keystone-bootstrap" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166361 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b4c780-bc31-4138-9aa2-3d47604bc88f" containerName="keystone-bootstrap" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166642 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="04b4c780-bc31-4138-9aa2-3d47604bc88f" containerName="keystone-bootstrap"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166661 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" containerName="placement-db-sync"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.166680 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" containerName="barbican-db-sync"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.169013 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.171086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.171163 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.171188 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.171232 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.171266 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plld5\" (UniqueName: \"kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.175480 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.175708 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.176674 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xzrm7"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.228948 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.256047 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.257977 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.264759 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klfjz\" (UniqueName: \"kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274280 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274366 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274481 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274556 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274661 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plld5\" (UniqueName: \"kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274788 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274868 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.274958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.275032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.275497 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.278747 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d44c58694-mkj7x"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.280128 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.291865 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.292566 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.292766 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.292874 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.293027 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tm5hs"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.293138 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.293248 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.299416 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.310124 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d44c58694-mkj7x"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.314394 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.315137 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.321793 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67c467f5b-st865"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.324300 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.334535 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c467f5b-st865"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.341651 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tgllq"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.341898 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.342068 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.342191 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.342185 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.347515 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plld5\" (UniqueName: \"kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5\") pod \"barbican-keystone-listener-69b7dcf4b4-9fq5c\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.363748 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.364077 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="dnsmasq-dns" containerID="cri-o://7e2f6c4b671377bf676feaf2c7688d19ba6b830a31ce89cf660153f2fc29e873" gracePeriod=10
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.371275 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377390 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377442 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-config-data\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377477 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-internal-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377501 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-scripts\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377587 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377612 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377642 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22hc\" (UniqueName: \"kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377662 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klfjz\" (UniqueName: \"kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377719 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-combined-ca-bundle\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377767 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s92nz\" (UniqueName: \"kubernetes.io/projected/096a6f51-befa-462f-b029-3d4d84e884bf-kube-api-access-s92nz\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377785 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-fernet-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377829 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377850 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-public-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.377912 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-credential-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.378527 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.381608 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.391953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.393480 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.395386 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.397147 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-85psj"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.402394 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.421135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klfjz\" (UniqueName: \"kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz\") pod \"barbican-worker-756cc89c77-vzp6q\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") " pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486509 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-internal-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-config-data\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486580 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-scripts\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486644 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486714 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486761 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22hc\" (UniqueName: \"kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486783 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486827 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486848 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-combined-ca-bundle\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486901 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s92nz\" (UniqueName: \"kubernetes.io/projected/096a6f51-befa-462f-b029-3d4d84e884bf-kube-api-access-s92nz\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486923 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-fernet-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.486979 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.487003 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-public-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.487088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-credential-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.494147 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.505480 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-config-data\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.506329 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.506932 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-internal-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.511428 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.513819 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.514208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-credential-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.515051 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.515568 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-fernet-keys\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.515825 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-public-tls-certs\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.519001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-combined-ca-bundle\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.521973 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/096a6f51-befa-462f-b029-3d4d84e884bf-scripts\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.522175 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.533185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22hc\" (UniqueName: \"kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.533409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs\") pod \"placement-67c467f5b-st865\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " pod="openstack/placement-67c467f5b-st865"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.556205 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s92nz\" (UniqueName: \"kubernetes.io/projected/096a6f51-befa-462f-b029-3d4d84e884bf-kube-api-access-s92nz\") pod \"keystone-5d44c58694-mkj7x\" (UID: \"096a6f51-befa-462f-b029-3d4d84e884bf\") " pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.568254 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.592069 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d44c58694-mkj7x"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.608321 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"]
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.618624 5033 generic.go:334] "Generic (PLEG): container finished" podID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerID="7e2f6c4b671377bf676feaf2c7688d19ba6b830a31ce89cf660153f2fc29e873" exitCode=0
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621241 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621294 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621324 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj"
Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621516 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb\") pod
\"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nln2r\" (UniqueName: \"kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.621675 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.624199 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67c467f5b-st865" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.625194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" event={"ID":"9d9c45af-6417-419e-8527-4102a5c6d96f","Type":"ContainerDied","Data":"7e2f6c4b671377bf676feaf2c7688d19ba6b830a31ce89cf660153f2fc29e873"} Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.625523 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.643397 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.665933 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7764867bbc-cjkpd"] Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.670724 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.695318 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm"] Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.697265 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725373 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725464 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nln2r\" (UniqueName: \"kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725504 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725622 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725641 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.725661 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.728191 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.728836 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb\") pod 
\"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.729610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.730092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.737081 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.761550 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"] Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.764205 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nln2r\" (UniqueName: \"kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r\") pod \"dnsmasq-dns-85ff748b95-85psj\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.814785 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7764867bbc-cjkpd"] Mar 19 
19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-combined-ca-bundle\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828603 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828630 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data-custom\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828674 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-logs\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828704 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle\") pod 
\"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4mn4\" (UniqueName: \"kubernetes.io/projected/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-kube-api-access-j4mn4\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828755 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828796 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-combined-ca-bundle\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828815 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828848 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data-custom\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828969 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-logs\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.828986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk8st\" (UniqueName: \"kubernetes.io/projected/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-kube-api-access-wk8st\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.829049 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmfw8\" (UniqueName: \"kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.887254 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm"] Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936538 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936674 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-combined-ca-bundle\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936723 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936763 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: 
\"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data-custom\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.936952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-logs\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937143 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8st\" (UniqueName: \"kubernetes.io/projected/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-kube-api-access-wk8st\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937240 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmfw8\" (UniqueName: \"kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8\") pod \"barbican-api-6564b97bbb-lwnvf\" 
(UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937433 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-combined-ca-bundle\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data-custom\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937678 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-logs\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle\") pod 
\"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.937756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4mn4\" (UniqueName: \"kubernetes.io/projected/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-kube-api-access-j4mn4\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.944744 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-logs\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.946497 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:15:59 crc kubenswrapper[5033]: I0319 19:15:59.959115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-logs\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.010019 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.038504 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-combined-ca-bundle\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.039036 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data-custom\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.046841 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-config-data\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.054501 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-combined-ca-bundle\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.054556 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: 
\"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.055290 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.055403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data-custom\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.061328 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-66c5bf8b4d-dnhrv"] Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.063490 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.068148 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.070757 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmfw8\" (UniqueName: \"kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8\") pod \"barbican-api-6564b97bbb-lwnvf\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.071797 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8st\" (UniqueName: \"kubernetes.io/projected/cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8-kube-api-access-wk8st\") pod \"barbican-keystone-listener-6d4b6bf66b-qkrnm\" (UID: \"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8\") " pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.080792 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4mn4\" (UniqueName: \"kubernetes.io/projected/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-kube-api-access-j4mn4\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: \"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.081122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b-config-data\") pod \"barbican-worker-7764867bbc-cjkpd\" (UID: 
\"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b\") " pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.085666 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66c5bf8b4d-dnhrv"] Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.108667 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.122196 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.124014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.138293 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7764867bbc-cjkpd" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142154 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-public-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142218 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-combined-ca-bundle\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142307 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-scripts\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142355 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91a4632-26f5-4e3e-82d7-53b28bd561f5-logs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142420 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-config-data\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142524 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-internal-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.142556 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg6xk\" (UniqueName: \"kubernetes.io/projected/f91a4632-26f5-4e3e-82d7-53b28bd561f5-kube-api-access-tg6xk\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.151045 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:00 crc 
kubenswrapper[5033]: I0319 19:16:00.168797 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.186583 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565796-zfzhh"] Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.194977 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.201783 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.201995 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.202116 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.218878 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-zfzhh"] Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.250918 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-internal-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.250973 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg6xk\" (UniqueName: \"kubernetes.io/projected/f91a4632-26f5-4e3e-82d7-53b28bd561f5-kube-api-access-tg6xk\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " 
pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251015 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-public-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251159 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251203 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-combined-ca-bundle\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251303 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-scripts\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " 
pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251365 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91a4632-26f5-4e3e-82d7-53b28bd561f5-logs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251403 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-config-data\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251428 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m589v\" (UniqueName: \"kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251462 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pqs\" (UniqueName: \"kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs\") pod \"auto-csr-approver-29565796-zfzhh\" (UID: \"6bf2ccd6-cf13-4015-974e-70e4fb20c374\") " pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251484 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " 
pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.251560 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.256597 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-config-data\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.257340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91a4632-26f5-4e3e-82d7-53b28bd561f5-logs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.270389 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-internal-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.281273 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-public-tls-certs\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.292630 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-scripts\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.293989 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg6xk\" (UniqueName: \"kubernetes.io/projected/f91a4632-26f5-4e3e-82d7-53b28bd561f5-kube-api-access-tg6xk\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.293995 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91a4632-26f5-4e3e-82d7-53b28bd561f5-combined-ca-bundle\") pod \"placement-66c5bf8b4d-dnhrv\" (UID: \"f91a4632-26f5-4e3e-82d7-53b28bd561f5\") " pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.335098 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375548 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m589v\" (UniqueName: \"kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pqs\" (UniqueName: \"kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs\") pod \"auto-csr-approver-29565796-zfzhh\" (UID: \"6bf2ccd6-cf13-4015-974e-70e4fb20c374\") " pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375669 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs\") pod \"barbican-api-55789975f4-8gdxc\" (UID: 
\"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.375691 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.377736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.383967 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.396807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.397303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 
19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.404086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pqs\" (UniqueName: \"kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs\") pod \"auto-csr-approver-29565796-zfzhh\" (UID: \"6bf2ccd6-cf13-4015-974e-70e4fb20c374\") " pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.404148 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m589v\" (UniqueName: \"kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v\") pod \"barbican-api-55789975f4-8gdxc\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvpcb\" (UniqueName: \"kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476620 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476657 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.476784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.483024 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb" (OuterVolumeSpecName: "kube-api-access-nvpcb") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "kube-api-access-nvpcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.498144 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.532667 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.542051 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.583843 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvpcb\" (UniqueName: \"kubernetes.io/projected/9d9c45af-6417-419e-8527-4102a5c6d96f-kube-api-access-nvpcb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.614571 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.667558 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.890519 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.900705 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.965736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.970844 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config" (OuterVolumeSpecName: "config") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:00 crc kubenswrapper[5033]: I0319 19:16:00.999740 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003238 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003351 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") pod \"9d9c45af-6417-419e-8527-4102a5c6d96f\" (UID: \"9d9c45af-6417-419e-8527-4102a5c6d96f\") " Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003818 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003837 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003847 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:01 crc kubenswrapper[5033]: W0319 19:16:01.003934 5033 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9d9c45af-6417-419e-8527-4102a5c6d96f/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.003945 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d9c45af-6417-419e-8527-4102a5c6d96f" (UID: "9d9c45af-6417-419e-8527-4102a5c6d96f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.105726 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d9c45af-6417-419e-8527-4102a5c6d96f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.122595 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jnr9g" event={"ID":"9d9c45af-6417-419e-8527-4102a5c6d96f","Type":"ContainerDied","Data":"cde937cd94acf9ff051e43b057e1700af31780f690403843abc5c439e3fa57b0"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123027 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123092 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d44c58694-mkj7x"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123164 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123236 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123333 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123399 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c467f5b-st865"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.123554 5033 scope.go:117] "RemoveContainer" containerID="7e2f6c4b671377bf676feaf2c7688d19ba6b830a31ce89cf660153f2fc29e873" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.253360 5033 scope.go:117] "RemoveContainer" 
containerID="c842f7230729f2973b92567048a0ae34324df8c125b9ad5ddb9ea8d445ffe2c8" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.403648 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.471058 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jnr9g"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.538315 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7764867bbc-cjkpd"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.560712 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.589538 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"] Mar 19 19:16:01 crc kubenswrapper[5033]: W0319 19:16:01.603686 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03288c59_b7b1_47ef_9d63_4ac39bdf0b0b.slice/crio-2cb561e1c10583441523c71c7745ad96192bbf6d4fb6968c9c4a31cff35f40e0 WatchSource:0}: Error finding container 2cb561e1c10583441523c71c7745ad96192bbf6d4fb6968c9c4a31cff35f40e0: Status 404 returned error can't find the container with id 2cb561e1c10583441523c71c7745ad96192bbf6d4fb6968c9c4a31cff35f40e0 Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.651752 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.705482 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerStarted","Data":"2cb561e1c10583441523c71c7745ad96192bbf6d4fb6968c9c4a31cff35f40e0"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 
19:16:01.725860 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerStarted","Data":"34867075780bbaf045d98bacc376ab27c1aaba47d48a7a6665940f139fc69046"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.749741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d44c58694-mkj7x" event={"ID":"096a6f51-befa-462f-b029-3d4d84e884bf","Type":"ContainerStarted","Data":"df9cc8a7d0f01dcb67fe438150f7c99e0aa4121937ae345b723750822a378663"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.749784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d44c58694-mkj7x" event={"ID":"096a6f51-befa-462f-b029-3d4d84e884bf","Type":"ContainerStarted","Data":"d2bc0825e71d4028fd4af7efb1e8bb82a5cda803880290d9118cafdd76719e13"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.751183 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d44c58694-mkj7x" Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.763084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-85psj" event={"ID":"8b94ddbe-3dfe-47ee-890c-94e4104ae543","Type":"ContainerStarted","Data":"eda70fe9239f9239335e36637858cfb2db71bc380e1f297ce8c8740269a23cb4"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.784752 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d44c58694-mkj7x" podStartSLOduration=2.784730164 podStartE2EDuration="2.784730164s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:01.773738424 +0000 UTC m=+1171.878768273" watchObservedRunningTime="2026-03-19 19:16:01.784730164 +0000 UTC m=+1171.889760013" Mar 19 19:16:01 crc 
kubenswrapper[5033]: I0319 19:16:01.804359 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hbc5" event={"ID":"715f11c5-fa05-4d4a-9d52-a21479ced465","Type":"ContainerStarted","Data":"d7910b8137230053464c6be9e00567c6796c1fc6a1d97a916e8ff5cef9431627"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.812065 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66c5bf8b4d-dnhrv"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.815502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerStarted","Data":"6ad67dc2d5b6a566b22fab555867cb52b8de8bd78da5a329257ac91b54dfe4e2"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.816596 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7764867bbc-cjkpd" event={"ID":"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b","Type":"ContainerStarted","Data":"4e23cd4e826f79a22562bc91b8dee63fd529789ccb4b4dc71d7bb4e18f144a31"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.819198 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerStarted","Data":"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.819224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerStarted","Data":"afcba1637dbbf8a867c3e3043c25317cee621a868ba787956065d18890d27c39"} Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.829619 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-zfzhh"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.833410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" event={"ID":"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8","Type":"ContainerStarted","Data":"95b122cf7a6fa0ff39131720be874b6f3251039e82fbe3eadad939887aaaeae7"} Mar 19 19:16:01 crc kubenswrapper[5033]: W0319 19:16:01.834984 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf2ccd6_cf13_4015_974e_70e4fb20c374.slice/crio-0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037 WatchSource:0}: Error finding container 0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037: Status 404 returned error can't find the container with id 0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037 Mar 19 19:16:01 crc kubenswrapper[5033]: W0319 19:16:01.939290 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf91a4632_26f5_4e3e_82d7_53b28bd561f5.slice/crio-0c36236d2d0d27514804b537bba59877c4b91bbeadfbf447ed0847777d210f38 WatchSource:0}: Error finding container 0c36236d2d0d27514804b537bba59877c4b91bbeadfbf447ed0847777d210f38: Status 404 returned error can't find the container with id 0c36236d2d0d27514804b537bba59877c4b91bbeadfbf447ed0847777d210f38 Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.948510 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:01 crc kubenswrapper[5033]: I0319 19:16:01.949972 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2hbc5" podStartSLOduration=4.606056448 podStartE2EDuration="43.949912802s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="2026-03-19 19:15:20.261776907 +0000 UTC m=+1130.366806756" lastFinishedPulling="2026-03-19 19:15:59.605633261 +0000 UTC m=+1169.710663110" observedRunningTime="2026-03-19 19:16:01.845101856 +0000 UTC m=+1171.950131705" 
watchObservedRunningTime="2026-03-19 19:16:01.949912802 +0000 UTC m=+1172.054942651" Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.667749 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" path="/var/lib/kubelet/pods/9d9c45af-6417-419e-8527-4102a5c6d96f/volumes" Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.867698 5033 generic.go:334] "Generic (PLEG): container finished" podID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerID="439f81739944be76daeb0c3a1290a5a30ef7f5e30056d5a3cdf48b87a33bb253" exitCode=0 Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.867799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-85psj" event={"ID":"8b94ddbe-3dfe-47ee-890c-94e4104ae543","Type":"ContainerDied","Data":"439f81739944be76daeb0c3a1290a5a30ef7f5e30056d5a3cdf48b87a33bb253"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.883419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66c5bf8b4d-dnhrv" event={"ID":"f91a4632-26f5-4e3e-82d7-53b28bd561f5","Type":"ContainerStarted","Data":"ec4a1d8ae447caba5e69b2a008a9ce200d651e05743e1a94aba1fe0a0653c2d2"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.883477 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66c5bf8b4d-dnhrv" event={"ID":"f91a4632-26f5-4e3e-82d7-53b28bd561f5","Type":"ContainerStarted","Data":"0c36236d2d0d27514804b537bba59877c4b91bbeadfbf447ed0847777d210f38"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.888441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerStarted","Data":"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.926758 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerStarted","Data":"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.928395 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c467f5b-st865" Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.928426 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c467f5b-st865" Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.932722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" event={"ID":"6bf2ccd6-cf13-4015-974e-70e4fb20c374","Type":"ContainerStarted","Data":"0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.948611 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerStarted","Data":"ac8b8d96f1460d7635a0de5fb434b354388ea5918276cd652f61f7a37ebe2d7b"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.948666 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerStarted","Data":"a58e73ee4334e29ce6253801774c6a91ba1ecf59d39c25a8c64412655ce23c9d"} Mar 19 19:16:02 crc kubenswrapper[5033]: I0319 19:16:02.971763 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67c467f5b-st865" podStartSLOduration=3.971747088 podStartE2EDuration="3.971747088s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:02.966938392 +0000 UTC m=+1173.071968241" 
watchObservedRunningTime="2026-03-19 19:16:02.971747088 +0000 UTC m=+1173.076776937" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.379138 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.403734 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cc86df956-fh6cw"] Mar 19 19:16:03 crc kubenswrapper[5033]: E0319 19:16:03.404180 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="dnsmasq-dns" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.404201 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="dnsmasq-dns" Mar 19 19:16:03 crc kubenswrapper[5033]: E0319 19:16:03.404224 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="init" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.404231 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="init" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.404428 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9c45af-6417-419e-8527-4102a5c6d96f" containerName="dnsmasq-dns" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.450052 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc86df956-fh6cw"] Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.450208 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.456184 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.456506 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.604666 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4t9\" (UniqueName: \"kubernetes.io/projected/646984ab-e574-4cab-8933-8c5ba324f84c-kube-api-access-lk4t9\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.604996 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data-custom\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.605081 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-public-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.605116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: 
\"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.605164 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-internal-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.605184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-combined-ca-bundle\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.605213 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646984ab-e574-4cab-8933-8c5ba324f84c-logs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706540 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-internal-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706583 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-combined-ca-bundle\") pod 
\"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646984ab-e574-4cab-8933-8c5ba324f84c-logs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4t9\" (UniqueName: \"kubernetes.io/projected/646984ab-e574-4cab-8933-8c5ba324f84c-kube-api-access-lk4t9\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706696 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data-custom\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706796 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-public-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.706845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: 
\"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.711227 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-public-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.711565 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646984ab-e574-4cab-8933-8c5ba324f84c-logs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.714410 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.715146 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-config-data-custom\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.715327 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-combined-ca-bundle\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 
19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.719636 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/646984ab-e574-4cab-8933-8c5ba324f84c-internal-tls-certs\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.734095 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk4t9\" (UniqueName: \"kubernetes.io/projected/646984ab-e574-4cab-8933-8c5ba324f84c-kube-api-access-lk4t9\") pod \"barbican-api-7cc86df956-fh6cw\" (UID: \"646984ab-e574-4cab-8933-8c5ba324f84c\") " pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.774481 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.966063 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerStarted","Data":"a11af2d5e1932acae55e5361cd4152bc48ab90c5b4c9b15525f3b0a419524c17"} Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.966424 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55789975f4-8gdxc" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api-log" containerID="cri-o://ac8b8d96f1460d7635a0de5fb434b354388ea5918276cd652f61f7a37ebe2d7b" gracePeriod=30 Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.966770 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55789975f4-8gdxc" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api" containerID="cri-o://a11af2d5e1932acae55e5361cd4152bc48ab90c5b4c9b15525f3b0a419524c17" 
gracePeriod=30 Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.972592 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66c5bf8b4d-dnhrv" event={"ID":"f91a4632-26f5-4e3e-82d7-53b28bd561f5","Type":"ContainerStarted","Data":"6b7af688327731d18641ed9730ffb4ebb704f50233522608c5ef942f97a97b49"} Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.973659 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.973769 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.977562 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerStarted","Data":"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a"} Mar 19 19:16:03 crc kubenswrapper[5033]: I0319 19:16:03.997261 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55789975f4-8gdxc" podStartSLOduration=4.997242108 podStartE2EDuration="4.997242108s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:03.986987978 +0000 UTC m=+1174.092017827" watchObservedRunningTime="2026-03-19 19:16:03.997242108 +0000 UTC m=+1174.102271957" Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.010962 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6564b97bbb-lwnvf" podStartSLOduration=5.010945464 podStartE2EDuration="5.010945464s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:16:04.003865304 +0000 UTC m=+1174.108895153" watchObservedRunningTime="2026-03-19 19:16:04.010945464 +0000 UTC m=+1174.115975313" Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.029578 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-66c5bf8b4d-dnhrv" podStartSLOduration=5.029562089 podStartE2EDuration="5.029562089s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:04.028849109 +0000 UTC m=+1174.133878968" watchObservedRunningTime="2026-03-19 19:16:04.029562089 +0000 UTC m=+1174.134591928" Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.993811 5033 generic.go:334] "Generic (PLEG): container finished" podID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerID="a11af2d5e1932acae55e5361cd4152bc48ab90c5b4c9b15525f3b0a419524c17" exitCode=0 Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.994078 5033 generic.go:334] "Generic (PLEG): container finished" podID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerID="ac8b8d96f1460d7635a0de5fb434b354388ea5918276cd652f61f7a37ebe2d7b" exitCode=143 Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.993985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerDied","Data":"a11af2d5e1932acae55e5361cd4152bc48ab90c5b4c9b15525f3b0a419524c17"} Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.994249 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerDied","Data":"ac8b8d96f1460d7635a0de5fb434b354388ea5918276cd652f61f7a37ebe2d7b"} Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.994423 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:04 crc kubenswrapper[5033]: I0319 19:16:04.994460 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.204205 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc86df956-fh6cw"] Mar 19 19:16:05 crc kubenswrapper[5033]: W0319 19:16:05.223614 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod646984ab_e574_4cab_8933_8c5ba324f84c.slice/crio-87e9b86dacb1cdb48e5b42bb4d1f549cc6a2d05b0d67f6182c21a6d41c39f08f WatchSource:0}: Error finding container 87e9b86dacb1cdb48e5b42bb4d1f549cc6a2d05b0d67f6182c21a6d41c39f08f: Status 404 returned error can't find the container with id 87e9b86dacb1cdb48e5b42bb4d1f549cc6a2d05b0d67f6182c21a6d41c39f08f Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.226394 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.352523 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data\") pod \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.352811 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m589v\" (UniqueName: \"kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v\") pod \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.352913 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom\") pod \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.352948 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle\") pod \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.353024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs\") pod \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\" (UID: \"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d\") " Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.353581 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs" (OuterVolumeSpecName: "logs") pod "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" (UID: "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.364940 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v" (OuterVolumeSpecName: "kube-api-access-m589v") pod "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" (UID: "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d"). InnerVolumeSpecName "kube-api-access-m589v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.368273 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" (UID: "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.450106 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" (UID: "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.455165 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.455197 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.455207 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.455218 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m589v\" (UniqueName: \"kubernetes.io/projected/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-kube-api-access-m589v\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.478316 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data" (OuterVolumeSpecName: "config-data") pod "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" (UID: "ea18c1c7-31aa-4464-b21c-ee6faaa7d43d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:05 crc kubenswrapper[5033]: I0319 19:16:05.556964 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.007822 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc86df956-fh6cw" event={"ID":"646984ab-e574-4cab-8933-8c5ba324f84c","Type":"ContainerStarted","Data":"fdd1bd93a320e55076deb62f391c12c770228b9d69fa18ee1dce67e458b176bf"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.008039 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc86df956-fh6cw" event={"ID":"646984ab-e574-4cab-8933-8c5ba324f84c","Type":"ContainerStarted","Data":"87e9b86dacb1cdb48e5b42bb4d1f549cc6a2d05b0d67f6182c21a6d41c39f08f"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.009630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerStarted","Data":"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.011046 5033 generic.go:334] "Generic (PLEG): container finished" podID="6bf2ccd6-cf13-4015-974e-70e4fb20c374" containerID="da77633a261b3813e864d2c099c0f8e3f5cc6e23046d4b8a79f2c1fc0d415695" exitCode=0 Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.011086 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" event={"ID":"6bf2ccd6-cf13-4015-974e-70e4fb20c374","Type":"ContainerDied","Data":"da77633a261b3813e864d2c099c0f8e3f5cc6e23046d4b8a79f2c1fc0d415695"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.013366 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-55789975f4-8gdxc" event={"ID":"ea18c1c7-31aa-4464-b21c-ee6faaa7d43d","Type":"ContainerDied","Data":"a58e73ee4334e29ce6253801774c6a91ba1ecf59d39c25a8c64412655ce23c9d"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.013402 5033 scope.go:117] "RemoveContainer" containerID="a11af2d5e1932acae55e5361cd4152bc48ab90c5b4c9b15525f3b0a419524c17" Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.013508 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55789975f4-8gdxc" Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.022796 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-85psj" event={"ID":"8b94ddbe-3dfe-47ee-890c-94e4104ae543","Type":"ContainerStarted","Data":"5f3dac1c6fccc58df7f700e74c0447de4bc0be8d3fff84099f90faf60cc6275e"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.023945 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.030871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" event={"ID":"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8","Type":"ContainerStarted","Data":"db72407520ac946cbd0264418a0ec10d55ecca283fdd95803a1aed87f14558c0"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.034926 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerStarted","Data":"9935be95c1eb332bcb24bb3e725730f00388cf48e552e7e0bd31e7471de5ad85"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.053169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7764867bbc-cjkpd" 
event={"ID":"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b","Type":"ContainerStarted","Data":"cb902adf3bc188b9cbff9b9e4fa03fe1d183e5be8c6c9165428204aef96676ab"} Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.082224 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-85psj" podStartSLOduration=7.082205404 podStartE2EDuration="7.082205404s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:06.052785515 +0000 UTC m=+1176.157815364" watchObservedRunningTime="2026-03-19 19:16:06.082205404 +0000 UTC m=+1176.187235253" Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.100438 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.112669 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55789975f4-8gdxc"] Mar 19 19:16:06 crc kubenswrapper[5033]: I0319 19:16:06.634748 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" path="/var/lib/kubelet/pods/ea18c1c7-31aa-4464-b21c-ee6faaa7d43d/volumes" Mar 19 19:16:07 crc kubenswrapper[5033]: I0319 19:16:07.062897 5033 generic.go:334] "Generic (PLEG): container finished" podID="715f11c5-fa05-4d4a-9d52-a21479ced465" containerID="d7910b8137230053464c6be9e00567c6796c1fc6a1d97a916e8ff5cef9431627" exitCode=0 Mar 19 19:16:07 crc kubenswrapper[5033]: I0319 19:16:07.063000 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hbc5" event={"ID":"715f11c5-fa05-4d4a-9d52-a21479ced465","Type":"ContainerDied","Data":"d7910b8137230053464c6be9e00567c6796c1fc6a1d97a916e8ff5cef9431627"} Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.082159 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565796-zfzhh" event={"ID":"6bf2ccd6-cf13-4015-974e-70e4fb20c374","Type":"ContainerDied","Data":"0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037"} Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.083063 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0812446c98bf2a98867301c4f068494e244a79a806fee3dc8b4ec5be283e2037" Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.131168 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.214258 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4pqs\" (UniqueName: \"kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs\") pod \"6bf2ccd6-cf13-4015-974e-70e4fb20c374\" (UID: \"6bf2ccd6-cf13-4015-974e-70e4fb20c374\") " Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.220310 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs" (OuterVolumeSpecName: "kube-api-access-z4pqs") pod "6bf2ccd6-cf13-4015-974e-70e4fb20c374" (UID: "6bf2ccd6-cf13-4015-974e-70e4fb20c374"). InnerVolumeSpecName "kube-api-access-z4pqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.317471 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4pqs\" (UniqueName: \"kubernetes.io/projected/6bf2ccd6-cf13-4015-974e-70e4fb20c374-kube-api-access-z4pqs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.774162 5033 scope.go:117] "RemoveContainer" containerID="ac8b8d96f1460d7635a0de5fb434b354388ea5918276cd652f61f7a37ebe2d7b" Mar 19 19:16:08 crc kubenswrapper[5033]: I0319 19:16:08.889327 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.031388 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.031484 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.031533 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.032352 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.032435 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.032481 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.032583 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg8bj\" (UniqueName: \"kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj\") pod \"715f11c5-fa05-4d4a-9d52-a21479ced465\" (UID: \"715f11c5-fa05-4d4a-9d52-a21479ced465\") " Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.033007 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715f11c5-fa05-4d4a-9d52-a21479ced465-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.037198 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts" (OuterVolumeSpecName: "scripts") pod 
"715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.040126 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj" (OuterVolumeSpecName: "kube-api-access-kg8bj") pod "715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "kube-api-access-kg8bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.040269 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.062295 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.103130 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data" (OuterVolumeSpecName: "config-data") pod "715f11c5-fa05-4d4a-9d52-a21479ced465" (UID: "715f11c5-fa05-4d4a-9d52-a21479ced465"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.105233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hbc5" event={"ID":"715f11c5-fa05-4d4a-9d52-a21479ced465","Type":"ContainerDied","Data":"64263f8aa7a503233e7368535ddba622457471801e480c084a4429bbc98b8a89"} Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.105285 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64263f8aa7a503233e7368535ddba622457471801e480c084a4429bbc98b8a89" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.105364 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hbc5" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.119556 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7764867bbc-cjkpd" event={"ID":"dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b","Type":"ContainerStarted","Data":"2e41312eda5a10f0df3040d3adf36f1e81f5ad648f84215703a539641ff9e535"} Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.121498 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-zfzhh" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.138213 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.138252 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.138265 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.138277 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg8bj\" (UniqueName: \"kubernetes.io/projected/715f11c5-fa05-4d4a-9d52-a21479ced465-kube-api-access-kg8bj\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.138289 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/715f11c5-fa05-4d4a-9d52-a21479ced465-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.155947 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7764867bbc-cjkpd" podStartSLOduration=7.073719746 podStartE2EDuration="10.155930836s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="2026-03-19 19:16:01.631994416 +0000 UTC m=+1171.737024265" lastFinishedPulling="2026-03-19 19:16:04.714205506 +0000 UTC m=+1174.819235355" observedRunningTime="2026-03-19 19:16:09.151586924 +0000 UTC m=+1179.256616773" 
watchObservedRunningTime="2026-03-19 19:16:09.155930836 +0000 UTC m=+1179.260960675" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.223186 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.240089 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-tvvt6"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.252159 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-tvvt6"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.287347 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:09 crc kubenswrapper[5033]: E0319 19:16:09.287812 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" containerName="cinder-db-sync" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.287824 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" containerName="cinder-db-sync" Mar 19 19:16:09 crc kubenswrapper[5033]: E0319 19:16:09.287833 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288240 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api" Mar 19 19:16:09 crc kubenswrapper[5033]: E0319 19:16:09.288258 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf2ccd6-cf13-4015-974e-70e4fb20c374" containerName="oc" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288266 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf2ccd6-cf13-4015-974e-70e4fb20c374" containerName="oc" Mar 19 19:16:09 crc kubenswrapper[5033]: E0319 19:16:09.288295 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api-log" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288301 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api-log" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288533 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" containerName="cinder-db-sync" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288552 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api-log" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288566 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea18c1c7-31aa-4464-b21c-ee6faaa7d43d" containerName="barbican-api" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.288578 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf2ccd6-cf13-4015-974e-70e4fb20c374" containerName="oc" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.289655 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.296438 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.296971 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.297143 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-44klq" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.305156 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.306869 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.415014 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.415238 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-85psj" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="dnsmasq-dns" containerID="cri-o://5f3dac1c6fccc58df7f700e74c0447de4bc0be8d3fff84099f90faf60cc6275e" gracePeriod=10 Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.419598 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.445091 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc 
kubenswrapper[5033]: I0319 19:16:09.445144 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.445165 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.445248 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.445301 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzppl\" (UniqueName: \"kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.445331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.454521 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.457010 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.484733 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547097 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547160 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547189 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547251 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547292 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547317 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 
19:16:09.547383 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbk96\" (UniqueName: \"kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzppl\" (UniqueName: \"kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.547440 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.549245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.553495 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.555359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.558310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.558937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.586141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzppl\" (UniqueName: \"kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl\") pod \"cinder-scheduler-0\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.603671 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.610511 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.614725 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.624776 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.627613 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.649793 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.650937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.651259 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.652050 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: 
\"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.652185 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.652354 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.651997 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.659465 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbk96\" (UniqueName: \"kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.654011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.657342 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.655594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.680524 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbk96\" (UniqueName: \"kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96\") pod \"dnsmasq-dns-5c9776ccc5-42kx2\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.761274 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.761491 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfdp\" (UniqueName: \"kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 
19:16:09.761555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.761661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.761785 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.761869 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.762002 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.798939 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.863900 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.863983 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864111 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfdp\" 
(UniqueName: \"kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864262 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.864897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.869141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.869255 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.869679 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.871720 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.895398 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfdp\" (UniqueName: \"kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp\") pod \"cinder-api-0\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " pod="openstack/cinder-api-0" Mar 19 19:16:09 crc kubenswrapper[5033]: I0319 19:16:09.998931 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.013717 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-85psj" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: connect: connection refused" Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.182168 5033 generic.go:334] "Generic (PLEG): container finished" podID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerID="5f3dac1c6fccc58df7f700e74c0447de4bc0be8d3fff84099f90faf60cc6275e" exitCode=0 Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.182427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-85psj" event={"ID":"8b94ddbe-3dfe-47ee-890c-94e4104ae543","Type":"ContainerDied","Data":"5f3dac1c6fccc58df7f700e74c0447de4bc0be8d3fff84099f90faf60cc6275e"} Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.597350 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.688925 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.689100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.689195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.689293 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nln2r\" (UniqueName: \"kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.689355 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.689409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb\") pod \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\" (UID: \"8b94ddbe-3dfe-47ee-890c-94e4104ae543\") " Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.766635 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r" (OuterVolumeSpecName: "kube-api-access-nln2r") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "kube-api-access-nln2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.771305 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee59143-3db6-4ba6-95d0-2017f1fa65e0" path="/var/lib/kubelet/pods/eee59143-3db6-4ba6-95d0-2017f1fa65e0/volumes" Mar 19 19:16:10 crc kubenswrapper[5033]: I0319 19:16:10.798978 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nln2r\" (UniqueName: \"kubernetes.io/projected/8b94ddbe-3dfe-47ee-890c-94e4104ae543-kube-api-access-nln2r\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.055556 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"] Mar 19 19:16:11 crc kubenswrapper[5033]: W0319 19:16:11.071688 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fb297fd_2058_4055_89f9_d74164243306.slice/crio-70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6 WatchSource:0}: Error finding container 70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6: Status 404 returned error can't find the container with id 70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6 Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.245316 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" event={"ID":"6fb297fd-2058-4055-89f9-d74164243306","Type":"ContainerStarted","Data":"70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6"} Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.246696 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerStarted","Data":"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5"} Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.255471 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-85psj" event={"ID":"8b94ddbe-3dfe-47ee-890c-94e4104ae543","Type":"ContainerDied","Data":"eda70fe9239f9239335e36637858cfb2db71bc380e1f297ce8c8740269a23cb4"} Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.255526 5033 scope.go:117] "RemoveContainer" containerID="5f3dac1c6fccc58df7f700e74c0447de4bc0be8d3fff84099f90faf60cc6275e" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.255677 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-85psj" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.256028 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.285431 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" podStartSLOduration=8.551824129 podStartE2EDuration="12.285408518s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="2026-03-19 19:16:00.979480775 +0000 UTC m=+1171.084510624" lastFinishedPulling="2026-03-19 19:16:04.713065164 +0000 UTC m=+1174.818095013" observedRunningTime="2026-03-19 19:16:11.270021814 +0000 UTC m=+1181.375051653" watchObservedRunningTime="2026-03-19 19:16:11.285408518 +0000 UTC m=+1181.390438367" Mar 19 19:16:11 crc kubenswrapper[5033]: W0319 19:16:11.326506 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3363e19c_9790_4d10_8f14_6278c9b548d7.slice/crio-50ca7aa928853fbbb19c4da54afeceaa9417298e51d2b22e5a13f3f6052b4a1a WatchSource:0}: Error finding container 50ca7aa928853fbbb19c4da54afeceaa9417298e51d2b22e5a13f3f6052b4a1a: Status 404 returned error can't find the container with id 50ca7aa928853fbbb19c4da54afeceaa9417298e51d2b22e5a13f3f6052b4a1a Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.425742 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.636344 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.646207 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.734046 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.740885 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.748126 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config" (OuterVolumeSpecName: "config") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.749503 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.749517 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.749526 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.768882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b94ddbe-3dfe-47ee-890c-94e4104ae543" (UID: "8b94ddbe-3dfe-47ee-890c-94e4104ae543"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.852548 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b94ddbe-3dfe-47ee-890c-94e4104ae543-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:11 crc kubenswrapper[5033]: I0319 19:16:11.876912 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.006427 5033 scope.go:117] "RemoveContainer" containerID="439f81739944be76daeb0c3a1290a5a30ef7f5e30056d5a3cdf48b87a33bb253" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.044155 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"] Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.065230 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-85psj"] Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.276635 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerStarted","Data":"abb72c3d3128e2f126bf3c4d2750ab3bb9d0d23f9c0ea0924388962223817d45"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.316718 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-jjb4h" event={"ID":"412949e0-343a-45ad-873c-ff40cecb82de","Type":"ContainerStarted","Data":"cb2b84a659e89d69c2e18fd3512eda93885f551219cb65b36fdc9660beb9a96d"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.319132 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc86df956-fh6cw" event={"ID":"646984ab-e574-4cab-8933-8c5ba324f84c","Type":"ContainerStarted","Data":"1828bec2823794a785d6cbb02c5a126ee7a99f89774cc029c97f7ca5a854b4de"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.319678 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.320243 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343180 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerStarted","Data":"ae4c64225ea5cc856c19800509703a577265fa80c2f1e5adf1c261124a2b78d3"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343421 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-central-agent" containerID="cri-o://23ed328b240314f21cb36dce1fe08d86bd057bbf640cd0d10843674a204d11b7" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343721 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343769 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="proxy-httpd" containerID="cri-o://ae4c64225ea5cc856c19800509703a577265fa80c2f1e5adf1c261124a2b78d3" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343817 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="sg-core" containerID="cri-o://eba221aef4fbf7142c38b2695a1c83d19c9029bf5ff14fcabd28006046d08c5c" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.343891 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" 
containerName="ceilometer-notification-agent" containerID="cri-o://2a2dc8e913f5460efd475ada44bceb3723831ce6b4936c432387ab2432bafcb8" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.384778 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-jjb4h" podStartSLOduration=4.539578564 podStartE2EDuration="54.384758961s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="2026-03-19 19:15:20.882194964 +0000 UTC m=+1130.987224823" lastFinishedPulling="2026-03-19 19:16:10.727375371 +0000 UTC m=+1180.832405220" observedRunningTime="2026-03-19 19:16:12.340424381 +0000 UTC m=+1182.445454230" watchObservedRunningTime="2026-03-19 19:16:12.384758961 +0000 UTC m=+1182.489788810" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.397247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" event={"ID":"cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8","Type":"ContainerStarted","Data":"0a69783f094dddcdfa9a3a3194ab42f1d53fba2f0f2152f0b107dd346be84f89"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.397563 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cc86df956-fh6cw" podStartSLOduration=9.397544031 podStartE2EDuration="9.397544031s" podCreationTimestamp="2026-03-19 19:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:12.365988291 +0000 UTC m=+1182.471018140" watchObservedRunningTime="2026-03-19 19:16:12.397544031 +0000 UTC m=+1182.502573880" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.405711 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerStarted","Data":"50ca7aa928853fbbb19c4da54afeceaa9417298e51d2b22e5a13f3f6052b4a1a"} Mar 19 19:16:12 crc 
kubenswrapper[5033]: I0319 19:16:12.414044 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.274392807 podStartE2EDuration="54.414025926s" podCreationTimestamp="2026-03-19 19:15:18 +0000 UTC" firstStartedPulling="2026-03-19 19:15:20.22313603 +0000 UTC m=+1130.328165879" lastFinishedPulling="2026-03-19 19:16:10.362769149 +0000 UTC m=+1180.467798998" observedRunningTime="2026-03-19 19:16:12.397051617 +0000 UTC m=+1182.502081466" watchObservedRunningTime="2026-03-19 19:16:12.414025926 +0000 UTC m=+1182.519055775" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.414803 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerStarted","Data":"e97b9904114bf1bd110bbeec608efc56279fee7a1162d07224267338bca12e65"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.414936 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-756cc89c77-vzp6q" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker-log" containerID="cri-o://9935be95c1eb332bcb24bb3e725730f00388cf48e552e7e0bd31e7471de5ad85" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.415204 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-756cc89c77-vzp6q" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker" containerID="cri-o://e97b9904114bf1bd110bbeec608efc56279fee7a1162d07224267338bca12e65" gracePeriod=30 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.432319 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6d4b6bf66b-qkrnm" podStartSLOduration=10.349149674 podStartE2EDuration="13.432301401s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="2026-03-19 
19:16:01.631727429 +0000 UTC m=+1171.736757278" lastFinishedPulling="2026-03-19 19:16:04.714879156 +0000 UTC m=+1174.819909005" observedRunningTime="2026-03-19 19:16:12.426060615 +0000 UTC m=+1182.531090474" watchObservedRunningTime="2026-03-19 19:16:12.432301401 +0000 UTC m=+1182.537331250" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.445110 5033 generic.go:334] "Generic (PLEG): container finished" podID="6fb297fd-2058-4055-89f9-d74164243306" containerID="72c546d3784520ab3ebc46592b98c1c828d81af78c917c9460e53d162b0ba0f1" exitCode=0 Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.447568 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" event={"ID":"6fb297fd-2058-4055-89f9-d74164243306","Type":"ContainerDied","Data":"72c546d3784520ab3ebc46592b98c1c828d81af78c917c9460e53d162b0ba0f1"} Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.464912 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"] Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.466914 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-756cc89c77-vzp6q" podStartSLOduration=9.694972377 podStartE2EDuration="13.466894387s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="2026-03-19 19:16:00.941146674 +0000 UTC m=+1171.046176523" lastFinishedPulling="2026-03-19 19:16:04.713068684 +0000 UTC m=+1174.818098533" observedRunningTime="2026-03-19 19:16:12.460925269 +0000 UTC m=+1182.565955118" watchObservedRunningTime="2026-03-19 19:16:12.466894387 +0000 UTC m=+1182.571924236" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.670100 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" path="/var/lib/kubelet/pods/8b94ddbe-3dfe-47ee-890c-94e4104ae543/volumes" Mar 19 19:16:12 crc kubenswrapper[5033]: I0319 19:16:12.839748 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.128601 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.479565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" event={"ID":"6fb297fd-2058-4055-89f9-d74164243306","Type":"ContainerStarted","Data":"4758a0cb959bfe4c15076c9fc66a3e8f9836850dcfa189b72706b28bc8c2e228"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.479941 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504868 5033 generic.go:334] "Generic (PLEG): container finished" podID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerID="ae4c64225ea5cc856c19800509703a577265fa80c2f1e5adf1c261124a2b78d3" exitCode=0 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504924 5033 generic.go:334] "Generic (PLEG): container finished" podID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerID="eba221aef4fbf7142c38b2695a1c83d19c9029bf5ff14fcabd28006046d08c5c" exitCode=2 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504910 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerDied","Data":"ae4c64225ea5cc856c19800509703a577265fa80c2f1e5adf1c261124a2b78d3"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504976 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerDied","Data":"eba221aef4fbf7142c38b2695a1c83d19c9029bf5ff14fcabd28006046d08c5c"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504992 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerDied","Data":"2a2dc8e913f5460efd475ada44bceb3723831ce6b4936c432387ab2432bafcb8"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.504936 5033 generic.go:334] "Generic (PLEG): container finished" podID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerID="2a2dc8e913f5460efd475ada44bceb3723831ce6b4936c432387ab2432bafcb8" exitCode=0 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.505014 5033 generic.go:334] "Generic (PLEG): container finished" podID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerID="23ed328b240314f21cb36dce1fe08d86bd057bbf640cd0d10843674a204d11b7" exitCode=0 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.505118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerDied","Data":"23ed328b240314f21cb36dce1fe08d86bd057bbf640cd0d10843674a204d11b7"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.508249 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" podStartSLOduration=4.508220692 podStartE2EDuration="4.508220692s" podCreationTimestamp="2026-03-19 19:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:13.499139676 +0000 UTC m=+1183.604169525" watchObservedRunningTime="2026-03-19 19:16:13.508220692 +0000 UTC m=+1183.613250541" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.534204 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerStarted","Data":"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.537226 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerID="9935be95c1eb332bcb24bb3e725730f00388cf48e552e7e0bd31e7471de5ad85" exitCode=143 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.538651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerDied","Data":"9935be95c1eb332bcb24bb3e725730f00388cf48e552e7e0bd31e7471de5ad85"} Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.538694 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener-log" containerID="cri-o://3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" gracePeriod=30 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.538724 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener" containerID="cri-o://9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" gracePeriod=30 Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.879862 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946079 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946116 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk8tx\" (UniqueName: \"kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946148 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946191 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946271 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946412 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts\") pod \"a614bc99-b392-49dd-935b-0e59e44fb6f5\" (UID: \"a614bc99-b392-49dd-935b-0e59e44fb6f5\") " Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946793 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.946858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.947192 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.947212 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a614bc99-b392-49dd-935b-0e59e44fb6f5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.957863 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx" (OuterVolumeSpecName: "kube-api-access-fk8tx") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "kube-api-access-fk8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.981220 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts" (OuterVolumeSpecName: "scripts") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:13 crc kubenswrapper[5033]: I0319 19:16:13.998322 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.049808 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.049840 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.049850 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk8tx\" (UniqueName: \"kubernetes.io/projected/a614bc99-b392-49dd-935b-0e59e44fb6f5-kube-api-access-fk8tx\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.211887 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.239120 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data" (OuterVolumeSpecName: "config-data") pod "a614bc99-b392-49dd-935b-0e59e44fb6f5" (UID: "a614bc99-b392-49dd-935b-0e59e44fb6f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.262623 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.262710 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a614bc99-b392-49dd-935b-0e59e44fb6f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.473630 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.559230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerStarted","Data":"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.559280 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api-log" containerID="cri-o://9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" gracePeriod=30 Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.559389 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api" containerID="cri-o://97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" gracePeriod=30 Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.559550 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 
19:16:14.566669 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plld5\" (UniqueName: \"kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5\") pod \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.566737 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs\") pod \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.566759 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle\") pod \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.566924 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data\") pod \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.566978 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom\") pod \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\" (UID: \"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9\") " Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.567366 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs" (OuterVolumeSpecName: "logs") pod "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" 
(UID: "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.576704 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" (UID: "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.576951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a614bc99-b392-49dd-935b-0e59e44fb6f5","Type":"ContainerDied","Data":"c7d90ab04bab566f5ba74b8495705aafa7542f99d6248c90cd252a4dbb3b4358"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.577007 5033 scope.go:117] "RemoveContainer" containerID="ae4c64225ea5cc856c19800509703a577265fa80c2f1e5adf1c261124a2b78d3" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.577169 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.579132 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5" (OuterVolumeSpecName: "kube-api-access-plld5") pod "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" (UID: "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9"). InnerVolumeSpecName "kube-api-access-plld5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.594470 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.594422304 podStartE2EDuration="5.594422304s" podCreationTimestamp="2026-03-19 19:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:14.577092996 +0000 UTC m=+1184.682122855" watchObservedRunningTime="2026-03-19 19:16:14.594422304 +0000 UTC m=+1184.699452143" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599607 5033 generic.go:334] "Generic (PLEG): container finished" podID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerID="9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" exitCode=0 Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599638 5033 generic.go:334] "Generic (PLEG): container finished" podID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerID="3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" exitCode=143 Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599740 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerDied","Data":"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599802 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerDied","Data":"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.599812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c" event={"ID":"44cd96c6-1235-4e8b-b0c3-3184a5aac2b9","Type":"ContainerDied","Data":"34867075780bbaf045d98bacc376ab27c1aaba47d48a7a6665940f139fc69046"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.606485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerStarted","Data":"7505998b6596770a09e952b8929261129dc92a64475fc6f10608fd489414c08e"} Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.606534 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.619162 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" (UID: "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.655221 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data" (OuterVolumeSpecName: "config-data") pod "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" (UID: "44cd96c6-1235-4e8b-b0c3-3184a5aac2b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.661622 5033 scope.go:117] "RemoveContainer" containerID="eba221aef4fbf7142c38b2695a1c83d19c9029bf5ff14fcabd28006046d08c5c" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.661789 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.671909 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plld5\" (UniqueName: \"kubernetes.io/projected/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-kube-api-access-plld5\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.671944 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.671955 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.671963 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.671978 5033 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.677543 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.716168 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717286 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener-log" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717299 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener-log" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717338 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="sg-core" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717347 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="sg-core" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717366 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-central-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717372 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-central-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717379 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="proxy-httpd" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717384 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="proxy-httpd" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717390 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="init" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717395 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="init" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717721 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717729 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717746 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-notification-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717752 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-notification-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.717762 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="dnsmasq-dns" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.717768 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="dnsmasq-dns" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718114 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-notification-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718125 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="proxy-httpd" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718138 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718149 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="ceilometer-central-agent" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718160 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b94ddbe-3dfe-47ee-890c-94e4104ae543" containerName="dnsmasq-dns" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718174 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" containerName="sg-core" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.718187 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" containerName="barbican-keystone-listener-log" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.728171 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.728276 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.736685 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.736934 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.742694 5033 scope.go:117] "RemoveContainer" containerID="2a2dc8e913f5460efd475ada44bceb3723831ce6b4936c432387ab2432bafcb8" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.788345 5033 scope.go:117] "RemoveContainer" containerID="23ed328b240314f21cb36dce1fe08d86bd057bbf640cd0d10843674a204d11b7" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.829252 5033 scope.go:117] "RemoveContainer" containerID="9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.835004 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.854351 5033 scope.go:117] "RemoveContainer" containerID="3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875235 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " 
pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875265 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875349 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875377 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr56v\" (UniqueName: \"kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.875419 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.883065 5033 scope.go:117] "RemoveContainer" 
containerID="9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.883714 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5\": container with ID starting with 9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5 not found: ID does not exist" containerID="9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.883751 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5"} err="failed to get container status \"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5\": rpc error: code = NotFound desc = could not find container \"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5\": container with ID starting with 9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5 not found: ID does not exist" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.883771 5033 scope.go:117] "RemoveContainer" containerID="3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" Mar 19 19:16:14 crc kubenswrapper[5033]: E0319 19:16:14.884086 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba\": container with ID starting with 3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba not found: ID does not exist" containerID="3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.884119 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba"} err="failed to get container status \"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba\": rpc error: code = NotFound desc = could not find container \"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba\": container with ID starting with 3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba not found: ID does not exist" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.884146 5033 scope.go:117] "RemoveContainer" containerID="9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.884437 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5"} err="failed to get container status \"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5\": rpc error: code = NotFound desc = could not find container \"9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5\": container with ID starting with 9fb86ebd3e8456763230979e64aa3d8c9cab64daa553b23750dad811e3d53ee5 not found: ID does not exist" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.884469 5033 scope.go:117] "RemoveContainer" containerID="3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.884834 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba"} err="failed to get container status \"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba\": rpc error: code = NotFound desc = could not find container \"3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba\": container with ID starting with 3c3341122483ff37bee50c9309a4a20cc16ba8bd460dc80ab23b7f671c1036ba not found: ID does not 
exist" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.931096 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"] Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.954258 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-69b7dcf4b4-9fq5c"] Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.977752 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.977811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr56v\" (UniqueName: \"kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.977835 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.977859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.977912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.978489 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.978625 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.978678 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.979364 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.983963 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.985825 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.987697 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.988244 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:14 crc kubenswrapper[5033]: I0319 19:16:14.997406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr56v\" (UniqueName: \"kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v\") pod \"ceilometer-0\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") " pod="openstack/ceilometer-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.054862 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.316035 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.396107 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.396578 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkfdp\" (UniqueName: \"kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.396612 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.397189 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.397223 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.397321 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.397348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts\") pod \"3363e19c-9790-4d10-8f14-6278c9b548d7\" (UID: \"3363e19c-9790-4d10-8f14-6278c9b548d7\") " Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.397984 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs" (OuterVolumeSpecName: "logs") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.398680 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3363e19c-9790-4d10-8f14-6278c9b548d7-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.398723 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.404388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp" (OuterVolumeSpecName: "kube-api-access-vkfdp") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). 
InnerVolumeSpecName "kube-api-access-vkfdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.407074 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.407579 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts" (OuterVolumeSpecName: "scripts") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.469849 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data" (OuterVolumeSpecName: "config-data") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.482781 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3363e19c-9790-4d10-8f14-6278c9b548d7" (UID: "3363e19c-9790-4d10-8f14-6278c9b548d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500830 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500867 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500880 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3363e19c-9790-4d10-8f14-6278c9b548d7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500890 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500900 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3363e19c-9790-4d10-8f14-6278c9b548d7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.500912 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkfdp\" (UniqueName: \"kubernetes.io/projected/3363e19c-9790-4d10-8f14-6278c9b548d7-kube-api-access-vkfdp\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.533564 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:15 crc kubenswrapper[5033]: W0319 19:16:15.557193 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod065ae47e_dafd_4588_b3c6_7340223988ce.slice/crio-1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195 WatchSource:0}: Error finding container 1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195: Status 404 returned error can't find the container with id 1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195 Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.617105 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerStarted","Data":"ada215b2b7e2af73c6ea402dd19ea7be8fa32fcb4d5a687f9dcbbdc15b5a29e1"} Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.621887 5033 generic.go:334] "Generic (PLEG): container finished" podID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerID="97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" exitCode=0 Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.621913 5033 generic.go:334] "Generic (PLEG): container finished" podID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerID="9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" exitCode=143 Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.621923 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.621976 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerDied","Data":"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1"} Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.622021 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerDied","Data":"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3"} Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.622034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3363e19c-9790-4d10-8f14-6278c9b548d7","Type":"ContainerDied","Data":"50ca7aa928853fbbb19c4da54afeceaa9417298e51d2b22e5a13f3f6052b4a1a"} Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.622051 5033 scope.go:117] "RemoveContainer" containerID="97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.627404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerStarted","Data":"1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195"} Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.645766 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.286222883 podStartE2EDuration="6.645748343s" podCreationTimestamp="2026-03-19 19:16:09 +0000 UTC" firstStartedPulling="2026-03-19 19:16:11.449803734 +0000 UTC m=+1181.554833583" lastFinishedPulling="2026-03-19 19:16:12.809329194 +0000 UTC m=+1182.914359043" observedRunningTime="2026-03-19 19:16:15.639245809 +0000 UTC m=+1185.744275658" watchObservedRunningTime="2026-03-19 
19:16:15.645748343 +0000 UTC m=+1185.750778192" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.660684 5033 scope.go:117] "RemoveContainer" containerID="9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.662706 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.672244 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.684649 5033 scope.go:117] "RemoveContainer" containerID="97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" Mar 19 19:16:15 crc kubenswrapper[5033]: E0319 19:16:15.688270 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1\": container with ID starting with 97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1 not found: ID does not exist" containerID="97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.688310 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1"} err="failed to get container status \"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1\": rpc error: code = NotFound desc = could not find container \"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1\": container with ID starting with 97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1 not found: ID does not exist" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.688334 5033 scope.go:117] "RemoveContainer" containerID="9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" Mar 19 19:16:15 crc kubenswrapper[5033]: E0319 
19:16:15.688882 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3\": container with ID starting with 9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3 not found: ID does not exist" containerID="9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.688903 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3"} err="failed to get container status \"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3\": rpc error: code = NotFound desc = could not find container \"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3\": container with ID starting with 9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3 not found: ID does not exist" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.688916 5033 scope.go:117] "RemoveContainer" containerID="97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.690523 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:15 crc kubenswrapper[5033]: E0319 19:16:15.691268 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691294 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api" Mar 19 19:16:15 crc kubenswrapper[5033]: E0319 19:16:15.691369 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api-log" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691379 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api-log" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691613 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1"} err="failed to get container status \"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1\": rpc error: code = NotFound desc = could not find container \"97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1\": container with ID starting with 97d72befba99eccee24c8b53aeb24cb87829b6f3b9f8b44ff3be5e75433b85c1 not found: ID does not exist" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691642 5033 scope.go:117] "RemoveContainer" containerID="9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691624 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.691698 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" containerName="cinder-api-log" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.692942 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.693657 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3"} err="failed to get container status \"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3\": rpc error: code = NotFound desc = could not find container \"9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3\": container with ID starting with 9c2743e5dccd389bc2ce248061effd7e9633e6396b31f4233b678db1529bd3f3 not found: ID does not exist" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.696631 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.697608 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.697702 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.715382 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.806050 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.806807 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83e7c9da-f095-4e81-8284-feccedc40ce4-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.806887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.806929 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-scripts\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.806958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e7c9da-f095-4e81-8284-feccedc40ce4-logs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.807030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2gfc\" (UniqueName: \"kubernetes.io/projected/83e7c9da-f095-4e81-8284-feccedc40ce4-kube-api-access-n2gfc\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.807070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data-custom\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.807116 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.807138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908788 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data-custom\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908910 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908939 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83e7c9da-f095-4e81-8284-feccedc40ce4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.908995 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-scripts\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.909018 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e7c9da-f095-4e81-8284-feccedc40ce4-logs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.909075 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2gfc\" (UniqueName: \"kubernetes.io/projected/83e7c9da-f095-4e81-8284-feccedc40ce4-kube-api-access-n2gfc\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 
19:16:15.909920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83e7c9da-f095-4e81-8284-feccedc40ce4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.910626 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e7c9da-f095-4e81-8284-feccedc40ce4-logs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.917049 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.917560 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data-custom\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.918959 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-public-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.921289 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-config-data\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " 
pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.932854 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-scripts\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.933507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e7c9da-f095-4e81-8284-feccedc40ce4-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:15 crc kubenswrapper[5033]: I0319 19:16:15.934265 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2gfc\" (UniqueName: \"kubernetes.io/projected/83e7c9da-f095-4e81-8284-feccedc40ce4-kube-api-access-n2gfc\") pod \"cinder-api-0\" (UID: \"83e7c9da-f095-4e81-8284-feccedc40ce4\") " pod="openstack/cinder-api-0" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.034700 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.297356 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.489342 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.552972 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.553194 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8646b64d4f-xtmzw" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-api" containerID="cri-o://32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a" gracePeriod=30 Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.553312 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8646b64d4f-xtmzw" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" containerID="cri-o://7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980" gracePeriod=30 Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.610841 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-65f99858cc-j489l"] Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.612589 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.639866 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3363e19c-9790-4d10-8f14-6278c9b548d7" path="/var/lib/kubelet/pods/3363e19c-9790-4d10-8f14-6278c9b548d7/volumes" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.640752 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44cd96c6-1235-4e8b-b0c3-3184a5aac2b9" path="/var/lib/kubelet/pods/44cd96c6-1235-4e8b-b0c3-3184a5aac2b9/volumes" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.641363 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a614bc99-b392-49dd-935b-0e59e44fb6f5" path="/var/lib/kubelet/pods/a614bc99-b392-49dd-935b-0e59e44fb6f5/volumes" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.643245 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65f99858cc-j489l"] Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.658803 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerStarted","Data":"ede2b6e3ba941aac764534319359ac39ff2a4d4be8954482a1ba7b44cbdfeb2a"} Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.660970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83e7c9da-f095-4e81-8284-feccedc40ce4","Type":"ContainerStarted","Data":"0a96b6449bb253c475ea8b9e12d473113a5e4865ddf82c4230cacf116bb2c75d"} Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.670231 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8646b64d4f-xtmzw" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9696/\": read tcp 10.217.0.2:33472->10.217.0.177:9696: read: connection reset by peer" Mar 19 19:16:16 crc kubenswrapper[5033]: 
I0319 19:16:16.757124 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-internal-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.757228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgnx7\" (UniqueName: \"kubernetes.io/projected/00c78e80-6687-41e0-9b1c-57b76358e01f-kube-api-access-pgnx7\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.757280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-ovndb-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.757326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-public-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.757467 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-combined-ca-bundle\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 
crc kubenswrapper[5033]: I0319 19:16:16.757520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.757593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-httpd-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.859848 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-internal-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgnx7\" (UniqueName: \"kubernetes.io/projected/00c78e80-6687-41e0-9b1c-57b76358e01f-kube-api-access-pgnx7\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-ovndb-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860335 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-public-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860413 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-combined-ca-bundle\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860507 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.860558 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-httpd-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.867011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-httpd-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.879785 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-ovndb-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.881029 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-public-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.881631 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-config\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.883646 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgnx7\" (UniqueName: \"kubernetes.io/projected/00c78e80-6687-41e0-9b1c-57b76358e01f-kube-api-access-pgnx7\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.891131 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-combined-ca-bundle\") pod \"neutron-65f99858cc-j489l\" (UID: \"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.896254 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c78e80-6687-41e0-9b1c-57b76358e01f-internal-tls-certs\") pod \"neutron-65f99858cc-j489l\" (UID: 
\"00c78e80-6687-41e0-9b1c-57b76358e01f\") " pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:16 crc kubenswrapper[5033]: I0319 19:16:16.942951 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.633986 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65f99858cc-j489l"] Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.740281 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83e7c9da-f095-4e81-8284-feccedc40ce4","Type":"ContainerStarted","Data":"0aa9e3615eeac28630fd03923a569a1fb4df89588e3eccfa6ee402828e307876"} Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.743179 5033 generic.go:334] "Generic (PLEG): container finished" podID="412949e0-343a-45ad-873c-ff40cecb82de" containerID="cb2b84a659e89d69c2e18fd3512eda93885f551219cb65b36fdc9660beb9a96d" exitCode=0 Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.743224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-jjb4h" event={"ID":"412949e0-343a-45ad-873c-ff40cecb82de","Type":"ContainerDied","Data":"cb2b84a659e89d69c2e18fd3512eda93885f551219cb65b36fdc9660beb9a96d"} Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.748578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65f99858cc-j489l" event={"ID":"00c78e80-6687-41e0-9b1c-57b76358e01f","Type":"ContainerStarted","Data":"3e2aaced004e7f8f22e27cdf1d85d8de7c5c09ee94a97c51412858177fa2716c"} Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.790683 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerStarted","Data":"f0f03ff28cb2fd6dfd16717139933800e552a46a34678c7bbc6c1fd106625c59"} Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.801628 5033 generic.go:334] "Generic (PLEG): 
container finished" podID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerID="7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980" exitCode=0 Mar 19 19:16:17 crc kubenswrapper[5033]: I0319 19:16:17.801678 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerDied","Data":"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980"} Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.110028 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8646b64d4f-xtmzw" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9696/\": dial tcp 10.217.0.177:9696: connect: connection refused" Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.787545 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc86df956-fh6cw" Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.824407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65f99858cc-j489l" event={"ID":"00c78e80-6687-41e0-9b1c-57b76358e01f","Type":"ContainerStarted","Data":"c231b4a4be6576035d2f7fd732b37c281a10b81bdf3f99b28bbd15fadbff14e6"} Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.824477 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65f99858cc-j489l" event={"ID":"00c78e80-6687-41e0-9b1c-57b76358e01f","Type":"ContainerStarted","Data":"e6762d152381dee8637a6150d07aaeca855f43664f5a9474f9ecee9b561b0829"} Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.825313 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-65f99858cc-j489l" Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.830144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerStarted","Data":"7e4658f3b60f3d3b8b5b4f93069cf2b1007b5fe4254ff40ba00262959d1a5943"} Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.850977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83e7c9da-f095-4e81-8284-feccedc40ce4","Type":"ContainerStarted","Data":"3d43e5644fdc2f5cb7d74d943a0073862f68e6047dc8d5adf882e8b67190cc0e"} Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.851759 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.860226 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"] Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.860497 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6564b97bbb-lwnvf" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api-log" containerID="cri-o://62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547" gracePeriod=30 Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.862352 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6564b97bbb-lwnvf" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api" containerID="cri-o://14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a" gracePeriod=30 Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.895599 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-65f99858cc-j489l" podStartSLOduration=2.89557589 podStartE2EDuration="2.89557589s" podCreationTimestamp="2026-03-19 19:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:18.86262485 +0000 UTC m=+1188.967654689" 
watchObservedRunningTime="2026-03-19 19:16:18.89557589 +0000 UTC m=+1189.000605729" Mar 19 19:16:18 crc kubenswrapper[5033]: I0319 19:16:18.916187 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.91616893 podStartE2EDuration="3.91616893s" podCreationTimestamp="2026-03-19 19:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:18.904859891 +0000 UTC m=+1189.009889740" watchObservedRunningTime="2026-03-19 19:16:18.91616893 +0000 UTC m=+1189.021198779" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.389539 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.543617 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data\") pod \"412949e0-343a-45ad-873c-ff40cecb82de\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.543737 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle\") pod \"412949e0-343a-45ad-873c-ff40cecb82de\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.543768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs\") pod \"412949e0-343a-45ad-873c-ff40cecb82de\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.543795 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qlms6\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6\") pod \"412949e0-343a-45ad-873c-ff40cecb82de\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.543895 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts\") pod \"412949e0-343a-45ad-873c-ff40cecb82de\" (UID: \"412949e0-343a-45ad-873c-ff40cecb82de\") " Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.575689 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6" (OuterVolumeSpecName: "kube-api-access-qlms6") pod "412949e0-343a-45ad-873c-ff40cecb82de" (UID: "412949e0-343a-45ad-873c-ff40cecb82de"). InnerVolumeSpecName "kube-api-access-qlms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.580656 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs" (OuterVolumeSpecName: "certs") pod "412949e0-343a-45ad-873c-ff40cecb82de" (UID: "412949e0-343a-45ad-873c-ff40cecb82de"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.580763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts" (OuterVolumeSpecName: "scripts") pod "412949e0-343a-45ad-873c-ff40cecb82de" (UID: "412949e0-343a-45ad-873c-ff40cecb82de"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.613538 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "412949e0-343a-45ad-873c-ff40cecb82de" (UID: "412949e0-343a-45ad-873c-ff40cecb82de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.624578 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data" (OuterVolumeSpecName: "config-data") pod "412949e0-343a-45ad-873c-ff40cecb82de" (UID: "412949e0-343a-45ad-873c-ff40cecb82de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.631921 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.645945 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.645985 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlms6\" (UniqueName: \"kubernetes.io/projected/412949e0-343a-45ad-873c-ff40cecb82de-kube-api-access-qlms6\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.645994 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.646003 5033 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.646014 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/412949e0-343a-45ad-873c-ff40cecb82de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.801627 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.888163 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-dv4ng"] Mar 19 19:16:19 crc kubenswrapper[5033]: E0319 19:16:19.888605 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412949e0-343a-45ad-873c-ff40cecb82de" containerName="cloudkitty-db-sync" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.888621 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="412949e0-343a-45ad-873c-ff40cecb82de" containerName="cloudkitty-db-sync" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.888808 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="412949e0-343a-45ad-873c-ff40cecb82de" containerName="cloudkitty-db-sync" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.889575 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.941531 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-jjb4h" event={"ID":"412949e0-343a-45ad-873c-ff40cecb82de","Type":"ContainerDied","Data":"924b18f7e4d021a4a838120a69f156ac17723131c128d17703277fbd8485881c"} Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.941863 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924b18f7e4d021a4a838120a69f156ac17723131c128d17703277fbd8485881c" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.941952 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-jjb4h" Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.945553 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-dv4ng"] Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.950412 5033 generic.go:334] "Generic (PLEG): container finished" podID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerID="62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547" exitCode=143 Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.951511 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerDied","Data":"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547"} Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.962528 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"] Mar 19 19:16:19 crc kubenswrapper[5033]: I0319 19:16:19.962801 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="dnsmasq-dns" 
containerID="cri-o://01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12" gracePeriod=10 Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.068561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.068609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgvzv\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.068662 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.068700 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.068741 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs\") pod 
\"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.132049 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.172834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.172887 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgvzv\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.172927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.172956 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.172989 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.180007 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.180441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.181999 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.185072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs\") pod \"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.194384 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgvzv\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv\") pod 
\"cloudkitty-storageinit-dv4ng\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.219654 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.233903 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.664835 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.748726 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-dv4ng"] Mar 19 19:16:20 crc kubenswrapper[5033]: W0319 19:16:20.750547 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e68c82_8a51_4824_a4ae_211e63d05144.slice/crio-7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015 WatchSource:0}: Error finding container 7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015: Status 404 returned error can't find the container with id 7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015 Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.809837 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c44mj\" (UniqueName: \"kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj\") pod \"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.809971 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc\") pod 
\"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.810087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0\") pod \"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.810166 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config\") pod \"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.810202 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb\") pod \"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.810234 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb\") pod \"8e2f8699-552e-41fa-9109-033ab974d401\" (UID: \"8e2f8699-552e-41fa-9109-033ab974d401\") " Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.829180 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj" (OuterVolumeSpecName: "kube-api-access-c44mj") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "kube-api-access-c44mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.879763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.886069 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.886404 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.899818 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config" (OuterVolumeSpecName: "config") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.901001 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e2f8699-552e-41fa-9109-033ab974d401" (UID: "8e2f8699-552e-41fa-9109-033ab974d401"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914140 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914179 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914194 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914262 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914319 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c44mj\" (UniqueName: \"kubernetes.io/projected/8e2f8699-552e-41fa-9109-033ab974d401-kube-api-access-c44mj\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.914334 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8e2f8699-552e-41fa-9109-033ab974d401-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.961963 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e2f8699-552e-41fa-9109-033ab974d401" containerID="01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12" exitCode=0 Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.962054 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" event={"ID":"8e2f8699-552e-41fa-9109-033ab974d401","Type":"ContainerDied","Data":"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12"} Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.962315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" event={"ID":"8e2f8699-552e-41fa-9109-033ab974d401","Type":"ContainerDied","Data":"eba4d60e830e137277d107dfd290678587f630d6c400bee90286f068985f98bf"} Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.962103 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-bbrgq" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.962495 5033 scope.go:117] "RemoveContainer" containerID="01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12" Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.966428 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="cinder-scheduler" containerID="cri-o://7505998b6596770a09e952b8929261129dc92a64475fc6f10608fd489414c08e" gracePeriod=30 Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.966708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-dv4ng" event={"ID":"02e68c82-8a51-4824-a4ae-211e63d05144","Type":"ContainerStarted","Data":"7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015"} Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.966981 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="probe" containerID="cri-o://ada215b2b7e2af73c6ea402dd19ea7be8fa32fcb4d5a687f9dcbbdc15b5a29e1" gracePeriod=30 Mar 19 19:16:20 crc kubenswrapper[5033]: I0319 19:16:20.991185 5033 scope.go:117] "RemoveContainer" containerID="1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9" Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.012122 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"] Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.023714 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-bbrgq"] Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.025994 5033 scope.go:117] "RemoveContainer" containerID="01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12" Mar 19 19:16:21 crc kubenswrapper[5033]: E0319 
19:16:21.026361 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12\": container with ID starting with 01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12 not found: ID does not exist" containerID="01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12" Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.026404 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12"} err="failed to get container status \"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12\": rpc error: code = NotFound desc = could not find container \"01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12\": container with ID starting with 01f45510188176943a2b41a6b24b29167443119349568a0c0917ec6f90a48a12 not found: ID does not exist" Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.026436 5033 scope.go:117] "RemoveContainer" containerID="1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9" Mar 19 19:16:21 crc kubenswrapper[5033]: E0319 19:16:21.026871 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9\": container with ID starting with 1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9 not found: ID does not exist" containerID="1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9" Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.026898 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9"} err="failed to get container status \"1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9\": rpc 
error: code = NotFound desc = could not find container \"1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9\": container with ID starting with 1893df613f02e64e7fa1fadcbb72c01452c2decf31d6c4fecd237c6fc04ca2c9 not found: ID does not exist" Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.982497 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-dv4ng" event={"ID":"02e68c82-8a51-4824-a4ae-211e63d05144","Type":"ContainerStarted","Data":"aa2a4b7e07720b3e718c3466ef517985100d4fc5af6da3642c87053908055692"} Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985343 5033 generic.go:334] "Generic (PLEG): container finished" podID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerID="ada215b2b7e2af73c6ea402dd19ea7be8fa32fcb4d5a687f9dcbbdc15b5a29e1" exitCode=0 Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985368 5033 generic.go:334] "Generic (PLEG): container finished" podID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerID="7505998b6596770a09e952b8929261129dc92a64475fc6f10608fd489414c08e" exitCode=0 Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985382 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerDied","Data":"ada215b2b7e2af73c6ea402dd19ea7be8fa32fcb4d5a687f9dcbbdc15b5a29e1"} Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985398 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerDied","Data":"7505998b6596770a09e952b8929261129dc92a64475fc6f10608fd489414c08e"} Mar 19 19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d","Type":"ContainerDied","Data":"abb72c3d3128e2f126bf3c4d2750ab3bb9d0d23f9c0ea0924388962223817d45"} Mar 19 
19:16:21 crc kubenswrapper[5033]: I0319 19:16:21.985417 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb72c3d3128e2f126bf3c4d2750ab3bb9d0d23f9c0ea0924388962223817d45" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.007053 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-dv4ng" podStartSLOduration=3.007033164 podStartE2EDuration="3.007033164s" podCreationTimestamp="2026-03-19 19:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:22.003661239 +0000 UTC m=+1192.108691088" watchObservedRunningTime="2026-03-19 19:16:22.007033164 +0000 UTC m=+1192.112063013" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.038921 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.039929 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.040034 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.040068 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: 
\"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.041398 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.053048 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6564b97bbb-lwnvf" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:37324->10.217.0.183:9311: read: connection reset by peer" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.053068 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6564b97bbb-lwnvf" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:37326->10.217.0.183:9311: read: connection reset by peer" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.146293 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzppl\" (UniqueName: \"kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.146349 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: 
\"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.146693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts\") pod \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\" (UID: \"9aead74d-5c2d-4760-8a9f-d2d7957d5d8d\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.147080 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.152595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl" (OuterVolumeSpecName: "kube-api-access-qzppl") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "kube-api-access-qzppl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.157090 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts" (OuterVolumeSpecName: "scripts") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.167344 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.217545 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.226708 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data" (OuterVolumeSpecName: "config-data") pod "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" (UID: "9aead74d-5c2d-4760-8a9f-d2d7957d5d8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.248608 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.248647 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.248659 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.248667 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzppl\" (UniqueName: \"kubernetes.io/projected/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-kube-api-access-qzppl\") on node \"crc\" DevicePath \"\"" Mar 19 
19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.248676 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.541079 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.632083 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2f8699-552e-41fa-9109-033ab974d401" path="/var/lib/kubelet/pods/8e2f8699-552e-41fa-9109-033ab974d401/volumes" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.656732 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle\") pod \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.656872 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data\") pod \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.656892 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom\") pod \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.656969 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmfw8\" (UniqueName: 
\"kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8\") pod \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.657128 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs\") pod \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\" (UID: \"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b\") " Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.658046 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs" (OuterVolumeSpecName: "logs") pod "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" (UID: "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.663380 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" (UID: "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.668676 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8" (OuterVolumeSpecName: "kube-api-access-rmfw8") pod "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" (UID: "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b"). InnerVolumeSpecName "kube-api-access-rmfw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.689288 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" (UID: "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.727817 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data" (OuterVolumeSpecName: "config-data") pod "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" (UID: "03288c59-b7b1-47ef-9d63-4ac39bdf0b0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.759062 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.759095 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.759107 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmfw8\" (UniqueName: \"kubernetes.io/projected/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-kube-api-access-rmfw8\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.759116 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-logs\") on node \"crc\" DevicePath \"\"" Mar 19 
19:16:22 crc kubenswrapper[5033]: I0319 19:16:22.759127 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.000191 5033 generic.go:334] "Generic (PLEG): container finished" podID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerID="14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a" exitCode=0 Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.000240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerDied","Data":"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a"} Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.001019 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564b97bbb-lwnvf" event={"ID":"03288c59-b7b1-47ef-9d63-4ac39bdf0b0b","Type":"ContainerDied","Data":"2cb561e1c10583441523c71c7745ad96192bbf6d4fb6968c9c4a31cff35f40e0"} Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.000312 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6564b97bbb-lwnvf" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.001046 5033 scope.go:117] "RemoveContainer" containerID="14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.001431 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.040624 5033 scope.go:117] "RemoveContainer" containerID="62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.061601 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.082438 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.094786 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095322 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="probe" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095344 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="probe" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095365 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="cinder-scheduler" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095372 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="cinder-scheduler" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095403 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="dnsmasq-dns" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095411 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="dnsmasq-dns" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095430 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api-log" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095440 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api-log" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095474 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="init" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095483 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="init" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.095492 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095500 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095719 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095745 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" containerName="barbican-api-log" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095762 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="cinder-scheduler" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095777 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2f8699-552e-41fa-9109-033ab974d401" containerName="dnsmasq-dns" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.095796 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" containerName="probe" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.097329 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.100226 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.104220 5033 scope.go:117] "RemoveContainer" containerID="14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.104905 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a\": container with ID starting with 14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a not found: ID does not exist" containerID="14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.104943 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a"} err="failed to get container status \"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a\": rpc error: code = NotFound desc = could not find container \"14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a\": container with ID starting with 14c8cdd62a71b860755910fb817b0f269a6ba002811df39ef5d15fd2502bf29a not found: ID does not exist" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.104968 5033 scope.go:117] "RemoveContainer" containerID="62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547" Mar 19 19:16:23 crc kubenswrapper[5033]: E0319 19:16:23.105308 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547\": container with ID starting with 62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547 not found: ID does not exist" containerID="62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.105334 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547"} err="failed to get container status \"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547\": rpc error: code = NotFound desc = could not find container \"62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547\": container with ID starting with 62bb7d9a4ec3fbc863c7f0fe7c3c12b18eb69205ba3dd3a925ddd3ff755ba547 not found: ID does not exist" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.107577 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"] Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.127756 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6564b97bbb-lwnvf"] Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.140117 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.270965 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85d96370-ae45-4b23-b625-71562229c174-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.273478 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5wn\" (UniqueName: 
\"kubernetes.io/projected/85d96370-ae45-4b23-b625-71562229c174-kube-api-access-7j5wn\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.273646 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.273908 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-scripts\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.273967 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.274085 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.375913 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.376021 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-scripts\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.376048 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.376090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.376188 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85d96370-ae45-4b23-b625-71562229c174-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.376211 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5wn\" (UniqueName: \"kubernetes.io/projected/85d96370-ae45-4b23-b625-71562229c174-kube-api-access-7j5wn\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc 
kubenswrapper[5033]: I0319 19:16:23.376660 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85d96370-ae45-4b23-b625-71562229c174-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.381176 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.381941 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-config-data\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.382678 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-scripts\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.399242 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d96370-ae45-4b23-b625-71562229c174-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.402132 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5wn\" (UniqueName: 
\"kubernetes.io/projected/85d96370-ae45-4b23-b625-71562229c174-kube-api-access-7j5wn\") pod \"cinder-scheduler-0\" (UID: \"85d96370-ae45-4b23-b625-71562229c174\") " pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.427695 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.736139 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782249 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782353 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782383 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782500 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc 
kubenswrapper[5033]: I0319 19:16:23.782614 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwcl9\" (UniqueName: \"kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782656 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.782730 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config\") pod \"c150d402-0ecb-4631-b49d-da2c50a6f58d\" (UID: \"c150d402-0ecb-4631-b49d-da2c50a6f58d\") " Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.791732 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.792927 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9" (OuterVolumeSpecName: "kube-api-access-dwcl9") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "kube-api-access-dwcl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.839641 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.840030 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.842143 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config" (OuterVolumeSpecName: "config") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.849761 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.863154 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c150d402-0ecb-4631-b49d-da2c50a6f58d" (UID: "c150d402-0ecb-4631-b49d-da2c50a6f58d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885264 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885579 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885648 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885730 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwcl9\" (UniqueName: \"kubernetes.io/projected/c150d402-0ecb-4631-b49d-da2c50a6f58d-kube-api-access-dwcl9\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885793 5033 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885849 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.885905 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c150d402-0ecb-4631-b49d-da2c50a6f58d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:23 crc kubenswrapper[5033]: I0319 19:16:23.956574 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:16:23 crc kubenswrapper[5033]: W0319 19:16:23.962351 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d96370_ae45_4b23_b625_71562229c174.slice/crio-5a9c5786631e585f66ec1d0a09875f2b3b2ae51a46047e034db8a0cd1a73eeaa WatchSource:0}: Error finding container 5a9c5786631e585f66ec1d0a09875f2b3b2ae51a46047e034db8a0cd1a73eeaa: Status 404 returned error can't find the container with id 5a9c5786631e585f66ec1d0a09875f2b3b2ae51a46047e034db8a0cd1a73eeaa Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.014160 5033 generic.go:334] "Generic (PLEG): container finished" podID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerID="32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a" exitCode=0 Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.014217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerDied","Data":"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a"} Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.014239 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8646b64d4f-xtmzw" event={"ID":"c150d402-0ecb-4631-b49d-da2c50a6f58d","Type":"ContainerDied","Data":"ddb836bd9eafd11a4d5e8226055039e26d02b6674f1a590343e2f1342f039c1d"} Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 
19:16:24.014257 5033 scope.go:117] "RemoveContainer" containerID="7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.014348 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8646b64d4f-xtmzw" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.018625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85d96370-ae45-4b23-b625-71562229c174","Type":"ContainerStarted","Data":"5a9c5786631e585f66ec1d0a09875f2b3b2ae51a46047e034db8a0cd1a73eeaa"} Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.021776 5033 generic.go:334] "Generic (PLEG): container finished" podID="02e68c82-8a51-4824-a4ae-211e63d05144" containerID="aa2a4b7e07720b3e718c3466ef517985100d4fc5af6da3642c87053908055692" exitCode=0 Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.021812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-dv4ng" event={"ID":"02e68c82-8a51-4824-a4ae-211e63d05144","Type":"ContainerDied","Data":"aa2a4b7e07720b3e718c3466ef517985100d4fc5af6da3642c87053908055692"} Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.049892 5033 scope.go:117] "RemoveContainer" containerID="32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.068291 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.077112 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8646b64d4f-xtmzw"] Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.082343 5033 scope.go:117] "RemoveContainer" containerID="7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980" Mar 19 19:16:24 crc kubenswrapper[5033]: E0319 19:16:24.083430 5033 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980\": container with ID starting with 7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980 not found: ID does not exist" containerID="7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.083528 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980"} err="failed to get container status \"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980\": rpc error: code = NotFound desc = could not find container \"7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980\": container with ID starting with 7cb72edd19088f97dd5b6c07bb78f4944d44aec3d1b85f9f0347b5ea3550b980 not found: ID does not exist" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.083550 5033 scope.go:117] "RemoveContainer" containerID="32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a" Mar 19 19:16:24 crc kubenswrapper[5033]: E0319 19:16:24.085603 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a\": container with ID starting with 32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a not found: ID does not exist" containerID="32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.085646 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a"} err="failed to get container status \"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a\": rpc error: code = NotFound desc = could not find container 
\"32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a\": container with ID starting with 32d5db75a7f437b97a7deb7d395015be319703138972f0b2884b1919af56a22a not found: ID does not exist" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.646668 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03288c59-b7b1-47ef-9d63-4ac39bdf0b0b" path="/var/lib/kubelet/pods/03288c59-b7b1-47ef-9d63-4ac39bdf0b0b/volumes" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.648371 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aead74d-5c2d-4760-8a9f-d2d7957d5d8d" path="/var/lib/kubelet/pods/9aead74d-5c2d-4760-8a9f-d2d7957d5d8d/volumes" Mar 19 19:16:24 crc kubenswrapper[5033]: I0319 19:16:24.649721 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" path="/var/lib/kubelet/pods/c150d402-0ecb-4631-b49d-da2c50a6f58d/volumes" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.045720 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85d96370-ae45-4b23-b625-71562229c174","Type":"ContainerStarted","Data":"70757b0c2645f8aba77259e0a76226a39b8992a163639f4b6a3a6ce9b21cbe89"} Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.058567 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerStarted","Data":"bdd51a062876d9f7dceedf52293f85b19a2c186dac6d2f59eb290f75b4728515"} Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.058653 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.080904 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.080877788 podStartE2EDuration="2.080877788s" podCreationTimestamp="2026-03-19 19:16:23 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:25.067963054 +0000 UTC m=+1195.172992913" watchObservedRunningTime="2026-03-19 19:16:25.080877788 +0000 UTC m=+1195.185907647" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.099084 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7891883589999997 podStartE2EDuration="11.099062161s" podCreationTimestamp="2026-03-19 19:16:14 +0000 UTC" firstStartedPulling="2026-03-19 19:16:15.563698369 +0000 UTC m=+1185.668728218" lastFinishedPulling="2026-03-19 19:16:23.873572171 +0000 UTC m=+1193.978602020" observedRunningTime="2026-03-19 19:16:25.087495775 +0000 UTC m=+1195.192525634" watchObservedRunningTime="2026-03-19 19:16:25.099062161 +0000 UTC m=+1195.204092010" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.533401 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.722203 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs\") pod \"02e68c82-8a51-4824-a4ae-211e63d05144\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.722275 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts\") pod \"02e68c82-8a51-4824-a4ae-211e63d05144\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.722387 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data\") pod \"02e68c82-8a51-4824-a4ae-211e63d05144\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.722423 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgvzv\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv\") pod \"02e68c82-8a51-4824-a4ae-211e63d05144\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.722549 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle\") pod \"02e68c82-8a51-4824-a4ae-211e63d05144\" (UID: \"02e68c82-8a51-4824-a4ae-211e63d05144\") " Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.741793 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv" (OuterVolumeSpecName: "kube-api-access-bgvzv") pod "02e68c82-8a51-4824-a4ae-211e63d05144" (UID: "02e68c82-8a51-4824-a4ae-211e63d05144"). InnerVolumeSpecName "kube-api-access-bgvzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.742798 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts" (OuterVolumeSpecName: "scripts") pod "02e68c82-8a51-4824-a4ae-211e63d05144" (UID: "02e68c82-8a51-4824-a4ae-211e63d05144"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.743380 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs" (OuterVolumeSpecName: "certs") pod "02e68c82-8a51-4824-a4ae-211e63d05144" (UID: "02e68c82-8a51-4824-a4ae-211e63d05144"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.756250 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data" (OuterVolumeSpecName: "config-data") pod "02e68c82-8a51-4824-a4ae-211e63d05144" (UID: "02e68c82-8a51-4824-a4ae-211e63d05144"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.759460 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02e68c82-8a51-4824-a4ae-211e63d05144" (UID: "02e68c82-8a51-4824-a4ae-211e63d05144"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.824970 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.825007 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgvzv\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-kube-api-access-bgvzv\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.825018 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.825043 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02e68c82-8a51-4824-a4ae-211e63d05144-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:25 crc kubenswrapper[5033]: I0319 19:16:25.825052 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e68c82-8a51-4824-a4ae-211e63d05144-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.074256 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-dv4ng" event={"ID":"02e68c82-8a51-4824-a4ae-211e63d05144","Type":"ContainerDied","Data":"7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015"} Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.074292 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d532862baaafac7804a862f76c8cee4365da5537e4fd0b1f8f1be56295e4015" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.074366 5033 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-dv4ng" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.082096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85d96370-ae45-4b23-b625-71562229c174","Type":"ContainerStarted","Data":"3a9ab5a24986f48b707598bc4476a85d7ebacf1aeced7027bb53fb6abe7c731a"} Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.275359 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:26 crc kubenswrapper[5033]: E0319 19:16:26.276220 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e68c82-8a51-4824-a4ae-211e63d05144" containerName="cloudkitty-storageinit" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276236 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e68c82-8a51-4824-a4ae-211e63d05144" containerName="cloudkitty-storageinit" Mar 19 19:16:26 crc kubenswrapper[5033]: E0319 19:16:26.276250 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-api" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276257 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-api" Mar 19 19:16:26 crc kubenswrapper[5033]: E0319 19:16:26.276280 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276288 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276470 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-httpd" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276486 
5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c150d402-0ecb-4631-b49d-da2c50a6f58d" containerName="neutron-api" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.276500 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e68c82-8a51-4824-a4ae-211e63d05144" containerName="cloudkitty-storageinit" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.277214 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.279329 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.283203 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.283246 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.284167 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tf27p" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.284186 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.298804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360397 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360573 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58fk\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360692 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360857 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.360998 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.420584 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 
19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.422939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.462615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.463417 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.463558 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.463713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.463862 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " 
pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464028 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58fk\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464139 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464156 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" 
Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464259 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgvnt\" (UniqueName: \"kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.464299 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.466494 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.467321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.467943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.469344 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.470643 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.484795 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.496130 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58fk\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk\") pod \"cloudkitty-proc-0\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.542208 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.543812 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.547944 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.560069 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567360 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567419 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567654 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567707 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567769 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567840 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.567960 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgvnt\" (UniqueName: \"kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.568006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.568148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.568233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.568278 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wjfd\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.569153 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.569901 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.570612 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.571630 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.571953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.597107 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgvnt\" (UniqueName: \"kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt\") pod \"dnsmasq-dns-67bdc55879-b8sch\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.598955 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wjfd\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670752 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670792 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670813 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.670851 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.672514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.676871 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.681566 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.685358 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" 
Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.696746 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.700210 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.704984 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wjfd\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd\") pod \"cloudkitty-api-0\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.771537 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:26 crc kubenswrapper[5033]: I0319 19:16:26.891323 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:27 crc kubenswrapper[5033]: I0319 19:16:27.180253 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:27 crc kubenswrapper[5033]: I0319 19:16:27.322942 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 19 19:16:27 crc kubenswrapper[5033]: W0319 19:16:27.330805 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75101984_de53_4709_8870_e919c64bfd54.slice/crio-ef9cfde990829b38f7f10334fb10f29a9e5971543ac62030e1a2d4154e57ce9e WatchSource:0}: Error finding container ef9cfde990829b38f7f10334fb10f29a9e5971543ac62030e1a2d4154e57ce9e: Status 404 returned error can't find the container with id ef9cfde990829b38f7f10334fb10f29a9e5971543ac62030e1a2d4154e57ce9e Mar 19 19:16:27 crc kubenswrapper[5033]: I0319 19:16:27.490697 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:27 crc kubenswrapper[5033]: W0319 19:16:27.509716 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc3695b_4a79_4a64_b002_6f972e76bc1f.slice/crio-18a93fc5a77bf716640269cd30da0af0d808bd559990d180624c5102e13a3f0c WatchSource:0}: Error finding container 18a93fc5a77bf716640269cd30da0af0d808bd559990d180624c5102e13a3f0c: Status 404 returned error can't find the container with id 18a93fc5a77bf716640269cd30da0af0d808bd559990d180624c5102e13a3f0c Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.111694 5033 generic.go:334] "Generic (PLEG): container finished" podID="75101984-de53-4709-8870-e919c64bfd54" containerID="0f15bbdf214a5e7bb690de6c79b53a54ccda584ebce1f9f98545692503ca02bd" exitCode=0 Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.112007 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67bdc55879-b8sch" event={"ID":"75101984-de53-4709-8870-e919c64bfd54","Type":"ContainerDied","Data":"0f15bbdf214a5e7bb690de6c79b53a54ccda584ebce1f9f98545692503ca02bd"} Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.112033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" event={"ID":"75101984-de53-4709-8870-e919c64bfd54","Type":"ContainerStarted","Data":"ef9cfde990829b38f7f10334fb10f29a9e5971543ac62030e1a2d4154e57ce9e"} Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.114277 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerStarted","Data":"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88"} Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.114299 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerStarted","Data":"18a93fc5a77bf716640269cd30da0af0d808bd559990d180624c5102e13a3f0c"} Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.115147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0a208460-9d8a-41ab-a21e-8a932fcb2dba","Type":"ContainerStarted","Data":"ad4d52f0702aa640a003939d6bd942d6ec92bfa7955d9d67b9ed739e45f299c9"} Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.428912 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 19:16:28 crc kubenswrapper[5033]: I0319 19:16:28.548495 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.141617 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerStarted","Data":"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13"} Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.141921 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.145679 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0a208460-9d8a-41ab-a21e-8a932fcb2dba","Type":"ContainerStarted","Data":"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94"} Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.163939 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" event={"ID":"75101984-de53-4709-8870-e919c64bfd54","Type":"ContainerStarted","Data":"cdc2e359cb0222d0101827b6ba4c1c7166da08d5a4193628d49cb624e639a883"} Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.165183 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.184290 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.184268356 podStartE2EDuration="3.184268356s" podCreationTimestamp="2026-03-19 19:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:29.16560297 +0000 UTC m=+1199.270632839" watchObservedRunningTime="2026-03-19 19:16:29.184268356 +0000 UTC m=+1199.289298205" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.199386 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.244273869 podStartE2EDuration="3.199365612s" podCreationTimestamp="2026-03-19 19:16:26 +0000 UTC" firstStartedPulling="2026-03-19 
19:16:27.190346107 +0000 UTC m=+1197.295375956" lastFinishedPulling="2026-03-19 19:16:28.14543786 +0000 UTC m=+1198.250467699" observedRunningTime="2026-03-19 19:16:29.197921081 +0000 UTC m=+1199.302950940" watchObservedRunningTime="2026-03-19 19:16:29.199365612 +0000 UTC m=+1199.304395461" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.229486 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" podStartSLOduration=3.229463681 podStartE2EDuration="3.229463681s" podCreationTimestamp="2026-03-19 19:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:29.223829622 +0000 UTC m=+1199.328859471" watchObservedRunningTime="2026-03-19 19:16:29.229463681 +0000 UTC m=+1199.334493530" Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.618044 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:29 crc kubenswrapper[5033]: I0319 19:16:29.655619 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:31 crc kubenswrapper[5033]: I0319 19:16:31.190989 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" containerName="cloudkitty-proc" containerID="cri-o://c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94" gracePeriod=30 Mar 19 19:16:31 crc kubenswrapper[5033]: I0319 19:16:31.191090 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api-log" containerID="cri-o://aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" gracePeriod=30 Mar 19 19:16:31 crc kubenswrapper[5033]: I0319 19:16:31.191259 5033 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/cloudkitty-api-0" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api" containerID="cri-o://cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" gracePeriod=30 Mar 19 19:16:31 crc kubenswrapper[5033]: I0319 19:16:31.960055 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111612 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111713 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wjfd\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111742 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111798 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111869 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.111916 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data\") pod \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\" (UID: \"2cc3695b-4a79-4a64-b002-6f972e76bc1f\") " Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.114725 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs" (OuterVolumeSpecName: "logs") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.122673 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs" (OuterVolumeSpecName: "certs") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.125597 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts" (OuterVolumeSpecName: "scripts") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.126659 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd" (OuterVolumeSpecName: "kube-api-access-4wjfd") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "kube-api-access-4wjfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.141396 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.163630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.167601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data" (OuterVolumeSpecName: "config-data") pod "2cc3695b-4a79-4a64-b002-6f972e76bc1f" (UID: "2cc3695b-4a79-4a64-b002-6f972e76bc1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199659 5033 generic.go:334] "Generic (PLEG): container finished" podID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerID="cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" exitCode=0 Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199689 5033 generic.go:334] "Generic (PLEG): container finished" podID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerID="aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" exitCode=143 Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerDied","Data":"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13"} Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199733 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerDied","Data":"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88"} Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199742 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2cc3695b-4a79-4a64-b002-6f972e76bc1f","Type":"ContainerDied","Data":"18a93fc5a77bf716640269cd30da0af0d808bd559990d180624c5102e13a3f0c"} Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199757 5033 scope.go:117] "RemoveContainer" 
containerID="cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.199859 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213774 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213797 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213805 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213814 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wjfd\" (UniqueName: \"kubernetes.io/projected/2cc3695b-4a79-4a64-b002-6f972e76bc1f-kube-api-access-4wjfd\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213825 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213838 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cc3695b-4a79-4a64-b002-6f972e76bc1f-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.213848 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cc3695b-4a79-4a64-b002-6f972e76bc1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.236707 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.245373 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.251079 5033 scope.go:117] "RemoveContainer" containerID="aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.268594 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:32 crc kubenswrapper[5033]: E0319 19:16:32.269110 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api-log" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.269128 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api-log" Mar 19 19:16:32 crc kubenswrapper[5033]: E0319 19:16:32.269145 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.269151 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.269331 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api-log" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.269364 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" containerName="cloudkitty-api" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 
19:16:32.270385 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.274350 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d44c58694-mkj7x" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.274505 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.274839 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c467f5b-st865" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.275493 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.275663 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.309584 5033 scope.go:117] "RemoveContainer" containerID="cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.311884 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:32 crc kubenswrapper[5033]: E0319 19:16:32.314602 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13\": container with ID starting with cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13 not found: ID does not exist" containerID="cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.314646 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13"} err="failed to get container status \"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13\": rpc error: code = NotFound desc = could not find container \"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13\": container with ID starting with cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13 not found: ID does not exist" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.314669 5033 scope.go:117] "RemoveContainer" containerID="aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" Mar 19 19:16:32 crc kubenswrapper[5033]: E0319 19:16:32.315070 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88\": container with ID starting with aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88 not found: ID does not exist" containerID="aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.315089 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88"} err="failed to get container status \"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88\": rpc error: code = NotFound desc = could not find container \"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88\": container with ID starting with aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88 not found: ID does not exist" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.315101 5033 scope.go:117] "RemoveContainer" containerID="cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.315427 5033 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13"} err="failed to get container status \"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13\": rpc error: code = NotFound desc = could not find container \"cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13\": container with ID starting with cb87cc895ae1b601f127f9b1f3b79fb3bb625840948afa39da763ea20ab3ed13 not found: ID does not exist" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.315461 5033 scope.go:117] "RemoveContainer" containerID="aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.315847 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88"} err="failed to get container status \"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88\": rpc error: code = NotFound desc = could not find container \"aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88\": container with ID starting with aa3ff843d6baa78eb2a7c252957baa281f4ffd6c2ec9078ca2b97b01803b6e88 not found: ID does not exist" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.406061 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c467f5b-st865" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430279 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430414 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430531 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts\") pod \"cloudkitty-api-0\" (UID: 
\"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmcrs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.430733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.532852 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.532918 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.532967 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmcrs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: 
I0319 19:16:32.532997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533073 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533106 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.533972 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.537016 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.540223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.541502 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.543261 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.543424 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.543872 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.545915 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.553884 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmcrs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs\") pod \"cloudkitty-api-0\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.637759 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.658864 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc3695b-4a79-4a64-b002-6f972e76bc1f" path="/var/lib/kubelet/pods/2cc3695b-4a79-4a64-b002-6f972e76bc1f/volumes" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.811878 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.824437 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66c5bf8b4d-dnhrv" Mar 19 19:16:32 crc kubenswrapper[5033]: I0319 19:16:32.896587 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67c467f5b-st865"] Mar 19 19:16:33 crc kubenswrapper[5033]: I0319 19:16:33.144627 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:16:33 crc kubenswrapper[5033]: I0319 19:16:33.208617 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerStarted","Data":"9f954ce971d568130875ae50f9ec4965ce90353c0c93ff35b293c94c195d24c0"} Mar 19 19:16:33 crc kubenswrapper[5033]: I0319 19:16:33.686515 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 19:16:34 crc kubenswrapper[5033]: I0319 19:16:34.222778 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerStarted","Data":"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b"} Mar 19 19:16:34 crc kubenswrapper[5033]: I0319 19:16:34.222825 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerStarted","Data":"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64"} Mar 19 19:16:34 crc kubenswrapper[5033]: I0319 19:16:34.222861 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67c467f5b-st865" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-log" containerID="cri-o://6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23" gracePeriod=30 Mar 19 19:16:34 crc kubenswrapper[5033]: I0319 19:16:34.223019 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-67c467f5b-st865" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-api" containerID="cri-o://91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c" gracePeriod=30 Mar 19 19:16:34 crc kubenswrapper[5033]: I0319 19:16:34.252439 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.2524187319999998 podStartE2EDuration="2.252418732s" podCreationTimestamp="2026-03-19 19:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:34.246541496 +0000 UTC m=+1204.351571365" watchObservedRunningTime="2026-03-19 19:16:34.252418732 +0000 UTC m=+1204.357448581" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.053587 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211194 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211249 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211294 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211373 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.211417 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r58fk\" (UniqueName: 
\"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk\") pod \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\" (UID: \"0a208460-9d8a-41ab-a21e-8a932fcb2dba\") " Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.219668 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.223638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs" (OuterVolumeSpecName: "certs") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.223638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts" (OuterVolumeSpecName: "scripts") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.223858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk" (OuterVolumeSpecName: "kube-api-access-r58fk") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "kube-api-access-r58fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.237054 5033 generic.go:334] "Generic (PLEG): container finished" podID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerID="6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23" exitCode=143 Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.237129 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerDied","Data":"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"} Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.238572 5033 generic.go:334] "Generic (PLEG): container finished" podID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" containerID="c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94" exitCode=0 Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.238647 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.238669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0a208460-9d8a-41ab-a21e-8a932fcb2dba","Type":"ContainerDied","Data":"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94"} Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.238687 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0a208460-9d8a-41ab-a21e-8a932fcb2dba","Type":"ContainerDied","Data":"ad4d52f0702aa640a003939d6bd942d6ec92bfa7955d9d67b9ed739e45f299c9"} Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.238702 5033 scope.go:117] "RemoveContainer" containerID="c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.239233 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cloudkitty-api-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.240180 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data" (OuterVolumeSpecName: "config-data") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.251784 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a208460-9d8a-41ab-a21e-8a932fcb2dba" (UID: "0a208460-9d8a-41ab-a21e-8a932fcb2dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.287128 5033 scope.go:117] "RemoveContainer" containerID="c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94" Mar 19 19:16:35 crc kubenswrapper[5033]: E0319 19:16:35.288092 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94\": container with ID starting with c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94 not found: ID does not exist" containerID="c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.288128 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94"} err="failed to get container status \"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94\": rpc error: code = NotFound desc = could not find container 
\"c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94\": container with ID starting with c13f01441bc75f07c724306303cf0d6cb6b415de57cc8f7980f69e0a6efe5a94 not found: ID does not exist" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313798 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313822 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313831 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313839 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313847 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a208460-9d8a-41ab-a21e-8a932fcb2dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.313855 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r58fk\" (UniqueName: \"kubernetes.io/projected/0a208460-9d8a-41ab-a21e-8a932fcb2dba-kube-api-access-r58fk\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.604364 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 
19:16:35.627307 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.642387 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:35 crc kubenswrapper[5033]: E0319 19:16:35.643085 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" containerName="cloudkitty-proc" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.643211 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" containerName="cloudkitty-proc" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.643565 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" containerName="cloudkitty-proc" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.645626 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.652266 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.657160 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.669980 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.672489 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.674641 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.674692 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.674806 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2dxfs" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.679755 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726674 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726813 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726884 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckczf\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.726988 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829164 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829349 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829488 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-k4n7j\" (UniqueName: \"kubernetes.io/projected/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-kube-api-access-k4n7j\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829635 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829689 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829721 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829754 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckczf\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf\") pod \"cloudkitty-proc-0\" (UID: 
\"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.829820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.833530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.833551 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.833614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.833743 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.835169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.850862 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckczf\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf\") pod \"cloudkitty-proc-0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.932266 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.932575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.932742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config\") pod \"openstackclient\" 
(UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.932839 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4n7j\" (UniqueName: \"kubernetes.io/projected/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-kube-api-access-k4n7j\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.933430 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.935059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.936093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.953061 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4n7j\" (UniqueName: \"kubernetes.io/projected/d14b9715-0fd4-48c5-8531-42c4d60ec6e6-kube-api-access-k4n7j\") pod \"openstackclient\" (UID: \"d14b9715-0fd4-48c5-8531-42c4d60ec6e6\") " pod="openstack/openstackclient" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.966363 5033 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:16:35 crc kubenswrapper[5033]: I0319 19:16:35.998645 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.486656 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.605213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 19:16:36 crc kubenswrapper[5033]: W0319 19:16:36.607908 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd14b9715_0fd4_48c5_8531_42c4d60ec6e6.slice/crio-51e3e284ea7c8015011deb1dadc756ff18bbaa6db5180d3cff95e94228a9702d WatchSource:0}: Error finding container 51e3e284ea7c8015011deb1dadc756ff18bbaa6db5180d3cff95e94228a9702d: Status 404 returned error can't find the container with id 51e3e284ea7c8015011deb1dadc756ff18bbaa6db5180d3cff95e94228a9702d Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.635276 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a208460-9d8a-41ab-a21e-8a932fcb2dba" path="/var/lib/kubelet/pods/0a208460-9d8a-41ab-a21e-8a932fcb2dba/volumes" Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.774591 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.836580 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"] Mar 19 19:16:36 crc kubenswrapper[5033]: I0319 19:16:36.836803 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="dnsmasq-dns" 
containerID="cri-o://4758a0cb959bfe4c15076c9fc66a3e8f9836850dcfa189b72706b28bc8c2e228" gracePeriod=10 Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.259334 5033 generic.go:334] "Generic (PLEG): container finished" podID="6fb297fd-2058-4055-89f9-d74164243306" containerID="4758a0cb959bfe4c15076c9fc66a3e8f9836850dcfa189b72706b28bc8c2e228" exitCode=0 Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.259552 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" event={"ID":"6fb297fd-2058-4055-89f9-d74164243306","Type":"ContainerDied","Data":"4758a0cb959bfe4c15076c9fc66a3e8f9836850dcfa189b72706b28bc8c2e228"} Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.261037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1d668e98-8678-40bf-8c6c-f44f1937c5a0","Type":"ContainerStarted","Data":"72f15a7ac2c4129a38ffe07a8d6244a8da70fb44210764b9531ef198c1406225"} Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.261065 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1d668e98-8678-40bf-8c6c-f44f1937c5a0","Type":"ContainerStarted","Data":"1e202adfc8c92f6f20e9e9e635f0b2b68438541109bf1a5fde9cf80635288087"} Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.262296 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d14b9715-0fd4-48c5-8531-42c4d60ec6e6","Type":"ContainerStarted","Data":"51e3e284ea7c8015011deb1dadc756ff18bbaa6db5180d3cff95e94228a9702d"} Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.276246 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.276226064 podStartE2EDuration="2.276226064s" podCreationTimestamp="2026-03-19 19:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:16:37.275403471 +0000 UTC m=+1207.380433330" watchObservedRunningTime="2026-03-19 19:16:37.276226064 +0000 UTC m=+1207.381255923" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.342941 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.466192 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.467036 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.467153 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.467187 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.467254 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbk96\" (UniqueName: 
\"kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.467292 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb\") pod \"6fb297fd-2058-4055-89f9-d74164243306\" (UID: \"6fb297fd-2058-4055-89f9-d74164243306\") " Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.492831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96" (OuterVolumeSpecName: "kube-api-access-tbk96") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "kube-api-access-tbk96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.523197 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.526095 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.528054 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.529079 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config" (OuterVolumeSpecName: "config") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.529209 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fb297fd-2058-4055-89f9-d74164243306" (UID: "6fb297fd-2058-4055-89f9-d74164243306"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.569957 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.569996 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.570012 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbk96\" (UniqueName: \"kubernetes.io/projected/6fb297fd-2058-4055-89f9-d74164243306-kube-api-access-tbk96\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.570022 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.570031 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:37 crc kubenswrapper[5033]: I0319 19:16:37.570040 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fb297fd-2058-4055-89f9-d74164243306-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.061544 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67c467f5b-st865" Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202648 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202705 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22hc\" (UniqueName: \"kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202728 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202785 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.202943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs\") pod \"a8c04e38-eef9-4935-a82a-1721e8b49be5\" (UID: \"a8c04e38-eef9-4935-a82a-1721e8b49be5\") " Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.204553 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs" (OuterVolumeSpecName: "logs") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.211172 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts" (OuterVolumeSpecName: "scripts") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.211433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc" (OuterVolumeSpecName: "kube-api-access-w22hc") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "kube-api-access-w22hc". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.256047 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data" (OuterVolumeSpecName: "config-data") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.295501 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2" event={"ID":"6fb297fd-2058-4055-89f9-d74164243306","Type":"ContainerDied","Data":"70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6"}
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.295551 5033 scope.go:117] "RemoveContainer" containerID="4758a0cb959bfe4c15076c9fc66a3e8f9836850dcfa189b72706b28bc8c2e228"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.295679 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-42kx2"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.298902 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.303192 5033 generic.go:334] "Generic (PLEG): container finished" podID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerID="91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c" exitCode=0
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.303242 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerDied","Data":"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"}
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.303303 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c467f5b-st865" event={"ID":"a8c04e38-eef9-4935-a82a-1721e8b49be5","Type":"ContainerDied","Data":"afcba1637dbbf8a867c3e3043c25317cee621a868ba787956065d18890d27c39"}
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.303354 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67c467f5b-st865"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.305824 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.305863 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22hc\" (UniqueName: \"kubernetes.io/projected/a8c04e38-eef9-4935-a82a-1721e8b49be5-kube-api-access-w22hc\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.305876 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.305884 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.305893 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8c04e38-eef9-4935-a82a-1721e8b49be5-logs\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.356676 5033 scope.go:117] "RemoveContainer" containerID="72c546d3784520ab3ebc46592b98c1c828d81af78c917c9460e53d162b0ba0f1"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.356685 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.358524 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a8c04e38-eef9-4935-a82a-1721e8b49be5" (UID: "a8c04e38-eef9-4935-a82a-1721e8b49be5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.368833 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"]
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.382683 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-42kx2"]
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.407718 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.407749 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8c04e38-eef9-4935-a82a-1721e8b49be5-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.489400 5033 scope.go:117] "RemoveContainer" containerID="91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"
Mar 19 19:16:38 crc kubenswrapper[5033]: E0319 19:16:38.503914 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fb297fd_2058_4055_89f9_d74164243306.slice/crio-70233a1aefec2fb3ac2ee4a8206d79119748b14596988b4be2abaea3863dc0a6\": RecentStats: unable to find data in memory cache]"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.547852 5033 scope.go:117] "RemoveContainer" containerID="6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.639404 5033 scope.go:117] "RemoveContainer" containerID="91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"
Mar 19 19:16:38 crc kubenswrapper[5033]: E0319 19:16:38.642292 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c\": container with ID starting with 91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c not found: ID does not exist" containerID="91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.642325 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c"} err="failed to get container status \"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c\": rpc error: code = NotFound desc = could not find container \"91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c\": container with ID starting with 91296d52149ed27bcce2e74e4a8c499c17a20eb239f07dfe756857278af7441c not found: ID does not exist"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.642347 5033 scope.go:117] "RemoveContainer" containerID="6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"
Mar 19 19:16:38 crc kubenswrapper[5033]: E0319 19:16:38.646599 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23\": container with ID starting with 6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23 not found: ID does not exist" containerID="6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.646625 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23"} err="failed to get container status \"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23\": rpc error: code = NotFound desc = could not find container \"6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23\": container with ID starting with 6039a6d70ba8bc3b2a800dacedf215a44b767943934b45c2a228fae64c685b23 not found: ID does not exist"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.661939 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb297fd-2058-4055-89f9-d74164243306" path="/var/lib/kubelet/pods/6fb297fd-2058-4055-89f9-d74164243306/volumes"
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.686509 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-67c467f5b-st865"]
Mar 19 19:16:38 crc kubenswrapper[5033]: I0319 19:16:38.705191 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-67c467f5b-st865"]
Mar 19 19:16:40 crc kubenswrapper[5033]: I0319 19:16:40.636049 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" path="/var/lib/kubelet/pods/a8c04e38-eef9-4935-a82a-1721e8b49be5/volumes"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.644811 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6d74595b67-9drnj"]
Mar 19 19:16:42 crc kubenswrapper[5033]: E0319 19:16:42.645503 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="dnsmasq-dns"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645523 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="dnsmasq-dns"
Mar 19 19:16:42 crc kubenswrapper[5033]: E0319 19:16:42.645544 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-log"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645552 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-log"
Mar 19 19:16:42 crc kubenswrapper[5033]: E0319 19:16:42.645582 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="init"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645589 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="init"
Mar 19 19:16:42 crc kubenswrapper[5033]: E0319 19:16:42.645611 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-api"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645622 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-api"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645864 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-log"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645880 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb297fd-2058-4055-89f9-d74164243306" containerName="dnsmasq-dns"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.645897 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c04e38-eef9-4935-a82a-1721e8b49be5" containerName="placement-api"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.647929 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.652999 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.653102 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.653337 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.675350 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d74595b67-9drnj"]
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.798735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-internal-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.798962 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-combined-ca-bundle\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799022 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-config-data\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799168 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-etc-swift\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799729 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-log-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799793 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcsc5\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-kube-api-access-xcsc5\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-run-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.799883 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-public-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.872892 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.873174 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-central-agent" containerID="cri-o://ede2b6e3ba941aac764534319359ac39ff2a4d4be8954482a1ba7b44cbdfeb2a" gracePeriod=30
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.873248 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" containerID="cri-o://bdd51a062876d9f7dceedf52293f85b19a2c186dac6d2f59eb290f75b4728515" gracePeriod=30
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.873289 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="sg-core" containerID="cri-o://7e4658f3b60f3d3b8b5b4f93069cf2b1007b5fe4254ff40ba00262959d1a5943" gracePeriod=30
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.873303 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-notification-agent" containerID="cri-o://f0f03ff28cb2fd6dfd16717139933800e552a46a34678c7bbc6c1fd106625c59" gracePeriod=30
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.882733 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.193:3000/\": EOF"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902258 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-internal-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902337 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-combined-ca-bundle\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902388 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-config-data\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902460 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-etc-swift\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902583 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-log-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902621 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcsc5\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-kube-api-access-xcsc5\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902645 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-run-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.902856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-public-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.903712 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-run-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.903760 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66364c56-ef15-437e-8508-2f7b2c4471f8-log-httpd\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.910180 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-config-data\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.910299 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-public-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.910396 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-internal-tls-certs\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.917156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66364c56-ef15-437e-8508-2f7b2c4471f8-combined-ca-bundle\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.918905 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-etc-swift\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.926338 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcsc5\" (UniqueName: \"kubernetes.io/projected/66364c56-ef15-437e-8508-2f7b2c4471f8-kube-api-access-xcsc5\") pod \"swift-proxy-6d74595b67-9drnj\" (UID: \"66364c56-ef15-437e-8508-2f7b2c4471f8\") " pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:42 crc kubenswrapper[5033]: I0319 19:16:42.977007 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d74595b67-9drnj"
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.381619 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerID="e97b9904114bf1bd110bbeec608efc56279fee7a1162d07224267338bca12e65" exitCode=137
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.381706 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerDied","Data":"e97b9904114bf1bd110bbeec608efc56279fee7a1162d07224267338bca12e65"}
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385132 5033 generic.go:334] "Generic (PLEG): container finished" podID="065ae47e-dafd-4588-b3c6-7340223988ce" containerID="bdd51a062876d9f7dceedf52293f85b19a2c186dac6d2f59eb290f75b4728515" exitCode=0
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385159 5033 generic.go:334] "Generic (PLEG): container finished" podID="065ae47e-dafd-4588-b3c6-7340223988ce" containerID="7e4658f3b60f3d3b8b5b4f93069cf2b1007b5fe4254ff40ba00262959d1a5943" exitCode=2
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385167 5033 generic.go:334] "Generic (PLEG): container finished" podID="065ae47e-dafd-4588-b3c6-7340223988ce" containerID="ede2b6e3ba941aac764534319359ac39ff2a4d4be8954482a1ba7b44cbdfeb2a" exitCode=0
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerDied","Data":"bdd51a062876d9f7dceedf52293f85b19a2c186dac6d2f59eb290f75b4728515"}
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385542 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerDied","Data":"7e4658f3b60f3d3b8b5b4f93069cf2b1007b5fe4254ff40ba00262959d1a5943"}
Mar 19 19:16:43 crc kubenswrapper[5033]: I0319 19:16:43.385628 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerDied","Data":"ede2b6e3ba941aac764534319359ac39ff2a4d4be8954482a1ba7b44cbdfeb2a"}
Mar 19 19:16:44 crc kubenswrapper[5033]: I0319 19:16:44.407083 5033 generic.go:334] "Generic (PLEG): container finished" podID="065ae47e-dafd-4588-b3c6-7340223988ce" containerID="f0f03ff28cb2fd6dfd16717139933800e552a46a34678c7bbc6c1fd106625c59" exitCode=0
Mar 19 19:16:44 crc kubenswrapper[5033]: I0319 19:16:44.407132 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerDied","Data":"f0f03ff28cb2fd6dfd16717139933800e552a46a34678c7bbc6c1fd106625c59"}
Mar 19 19:16:45 crc kubenswrapper[5033]: I0319 19:16:45.056921 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.193:3000/\": dial tcp 10.217.0.193:3000: connect: connection refused"
Mar 19 19:16:46 crc kubenswrapper[5033]: I0319 19:16:46.953912 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-65f99858cc-j489l"
Mar 19 19:16:47 crc kubenswrapper[5033]: I0319 19:16:47.019596 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"]
Mar 19 19:16:47 crc kubenswrapper[5033]: I0319 19:16:47.019839 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f8fddc48-z2hb6" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-api" containerID="cri-o://a36446b46b023b1c81c31f698eef52d286ae4ecabe9d883ac720569f66d7cb6e" gracePeriod=30
Mar 19 19:16:47 crc kubenswrapper[5033]: I0319 19:16:47.019932 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f8fddc48-z2hb6" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-httpd" containerID="cri-o://43b6cf2576b94a1750263249bcfadb96a5fda6a6440e91e30e3fdeb466699612" gracePeriod=30
Mar 19 19:16:47 crc kubenswrapper[5033]: I0319 19:16:47.466952 5033 generic.go:334] "Generic (PLEG): container finished" podID="be100139-e76b-4549-a7a7-652dcd18354b" containerID="43b6cf2576b94a1750263249bcfadb96a5fda6a6440e91e30e3fdeb466699612" exitCode=0
Mar 19 19:16:47 crc kubenswrapper[5033]: I0319 19:16:47.467020 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerDied","Data":"43b6cf2576b94a1750263249bcfadb96a5fda6a6440e91e30e3fdeb466699612"}
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.183914 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-756cc89c77-vzp6q"
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.351682 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.359363 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle\") pod \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.359564 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klfjz\" (UniqueName: \"kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz\") pod \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.359613 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs\") pod \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.359649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom\") pod \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.359820 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data\") pod \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\" (UID: \"f9761bae-b27d-49c4-8ad3-f8001ecc3111\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.360227 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs" (OuterVolumeSpecName: "logs") pod "f9761bae-b27d-49c4-8ad3-f8001ecc3111" (UID: "f9761bae-b27d-49c4-8ad3-f8001ecc3111"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.362318 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9761bae-b27d-49c4-8ad3-f8001ecc3111-logs\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.366750 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9761bae-b27d-49c4-8ad3-f8001ecc3111" (UID: "f9761bae-b27d-49c4-8ad3-f8001ecc3111"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.366963 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz" (OuterVolumeSpecName: "kube-api-access-klfjz") pod "f9761bae-b27d-49c4-8ad3-f8001ecc3111" (UID: "f9761bae-b27d-49c4-8ad3-f8001ecc3111"). InnerVolumeSpecName "kube-api-access-klfjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.401442 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9761bae-b27d-49c4-8ad3-f8001ecc3111" (UID: "f9761bae-b27d-49c4-8ad3-f8001ecc3111"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.461765 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data" (OuterVolumeSpecName: "config-data") pod "f9761bae-b27d-49c4-8ad3-f8001ecc3111" (UID: "f9761bae-b27d-49c4-8ad3-f8001ecc3111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463579 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463657 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463691 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463721 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr56v\" (UniqueName: \"kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463824 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463910 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.463947 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd\") pod \"065ae47e-dafd-4588-b3c6-7340223988ce\" (UID: \"065ae47e-dafd-4588-b3c6-7340223988ce\") "
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464352 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464442 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464474 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klfjz\" (UniqueName: \"kubernetes.io/projected/f9761bae-b27d-49c4-8ad3-f8001ecc3111-kube-api-access-klfjz\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464485 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464496 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464507 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9761bae-b27d-49c4-8ad3-f8001ecc3111-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.464594 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.469276 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts" (OuterVolumeSpecName: "scripts") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.471580 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v" (OuterVolumeSpecName: "kube-api-access-qr56v") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "kube-api-access-qr56v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.497938 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d14b9715-0fd4-48c5-8531-42c4d60ec6e6","Type":"ContainerStarted","Data":"4eda40b80e7009d7e32defda2a4d76013866245f3ac1d6b9215a6f8010d8b945"} Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.501523 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.501707 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-756cc89c77-vzp6q" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.502103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-756cc89c77-vzp6q" event={"ID":"f9761bae-b27d-49c4-8ad3-f8001ecc3111","Type":"ContainerDied","Data":"6ad67dc2d5b6a566b22fab555867cb52b8de8bd78da5a329257ac91b54dfe4e2"} Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.502169 5033 scope.go:117] "RemoveContainer" containerID="e97b9904114bf1bd110bbeec608efc56279fee7a1162d07224267338bca12e65" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.516017 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.204387398 podStartE2EDuration="13.515999483s" podCreationTimestamp="2026-03-19 19:16:35 +0000 UTC" firstStartedPulling="2026-03-19 19:16:36.612440355 +0000 UTC m=+1206.717470204" lastFinishedPulling="2026-03-19 19:16:47.92405243 +0000 UTC m=+1218.029082289" observedRunningTime="2026-03-19 19:16:48.511070064 +0000 UTC m=+1218.616099913" watchObservedRunningTime="2026-03-19 19:16:48.515999483 +0000 UTC m=+1218.621029322" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.522785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"065ae47e-dafd-4588-b3c6-7340223988ce","Type":"ContainerDied","Data":"1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195"} Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.522872 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.546611 5033 scope.go:117] "RemoveContainer" containerID="9935be95c1eb332bcb24bb3e725730f00388cf48e552e7e0bd31e7471de5ad85" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.547482 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"] Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.559898 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-756cc89c77-vzp6q"] Mar 19 19:16:48 crc kubenswrapper[5033]: W0319 19:16:48.560079 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66364c56_ef15_437e_8508_2f7b2c4471f8.slice/crio-410bd6b5ca36214170edc7bf0fb64428df4ac7e951166d81be63ac73bd9dc372 WatchSource:0}: Error finding container 410bd6b5ca36214170edc7bf0fb64428df4ac7e951166d81be63ac73bd9dc372: Status 404 returned error can't find the container with id 410bd6b5ca36214170edc7bf0fb64428df4ac7e951166d81be63ac73bd9dc372 Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.565954 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.565979 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr56v\" (UniqueName: \"kubernetes.io/projected/065ae47e-dafd-4588-b3c6-7340223988ce-kube-api-access-qr56v\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.565988 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.566001 5033 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/065ae47e-dafd-4588-b3c6-7340223988ce-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.574387 5033 scope.go:117] "RemoveContainer" containerID="bdd51a062876d9f7dceedf52293f85b19a2c186dac6d2f59eb290f75b4728515" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.575289 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d74595b67-9drnj"] Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.594869 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data" (OuterVolumeSpecName: "config-data") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.600819 5033 scope.go:117] "RemoveContainer" containerID="7e4658f3b60f3d3b8b5b4f93069cf2b1007b5fe4254ff40ba00262959d1a5943" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.620090 5033 scope.go:117] "RemoveContainer" containerID="f0f03ff28cb2fd6dfd16717139933800e552a46a34678c7bbc6c1fd106625c59" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.642272 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "065ae47e-dafd-4588-b3c6-7340223988ce" (UID: "065ae47e-dafd-4588-b3c6-7340223988ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.668574 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.668612 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/065ae47e-dafd-4588-b3c6-7340223988ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.678282 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" path="/var/lib/kubelet/pods/f9761bae-b27d-49c4-8ad3-f8001ecc3111/volumes" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.692635 5033 scope.go:117] "RemoveContainer" containerID="ede2b6e3ba941aac764534319359ac39ff2a4d4be8954482a1ba7b44cbdfeb2a" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.849054 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod065ae47e_dafd_4588_b3c6_7340223988ce.slice/crio-1957a2906401f7aefc3e79d460fd3724512b00c72be7961446a87bcb6f53a195\": RecentStats: unable to find data in memory cache]" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.861606 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.870432 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.886940 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887373 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887390 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887400 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887406 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887424 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="sg-core" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887432 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="sg-core" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887466 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-central-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887473 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-central-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887485 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker-log" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887492 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker-log" Mar 19 19:16:48 crc kubenswrapper[5033]: E0319 19:16:48.887516 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-notification-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887523 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-notification-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887717 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="sg-core" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.887732 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-central-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.909989 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker-log" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.910044 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="proxy-httpd" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.910067 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9761bae-b27d-49c4-8ad3-f8001ecc3111" containerName="barbican-worker" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.910076 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" containerName="ceilometer-notification-agent" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.915186 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.915383 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-log" 
containerID="cri-o://674648697e22b3221647833afbe0ad84f56178bab693cda8b10a513986c21349" gracePeriod=30 Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.915542 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.916553 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-httpd" containerID="cri-o://63efe028fc693f0085f904b7ee103e19c28c6f8ae305f652d0a11d65dc2237db" gracePeriod=30 Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.920044 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.920240 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:16:48 crc kubenswrapper[5033]: I0319 19:16:48.942441 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075179 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075527 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075626 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lghv\" (UniqueName: \"kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075649 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.075759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.178404 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179121 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179198 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179261 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179280 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lghv\" (UniqueName: \"kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.179303 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.183025 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.183633 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.184145 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.186249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.196364 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 
19:16:49.196664 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.204210 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lghv\" (UniqueName: \"kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv\") pod \"ceilometer-0\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.244255 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.535843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d74595b67-9drnj" event={"ID":"66364c56-ef15-437e-8508-2f7b2c4471f8","Type":"ContainerStarted","Data":"01db791f863c30006ed148af572fb04ba3525a78c2b65fd40fb092e2766fb18c"} Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.536121 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d74595b67-9drnj" event={"ID":"66364c56-ef15-437e-8508-2f7b2c4471f8","Type":"ContainerStarted","Data":"04e21359814f4c7e77c5f219103fbaaf5e475dd0880e551102c618a4796c2c18"} Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.536131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d74595b67-9drnj" event={"ID":"66364c56-ef15-437e-8508-2f7b2c4471f8","Type":"ContainerStarted","Data":"410bd6b5ca36214170edc7bf0fb64428df4ac7e951166d81be63ac73bd9dc372"} Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.536178 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d74595b67-9drnj" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 
19:16:49.536208 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d74595b67-9drnj" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.557065 5033 generic.go:334] "Generic (PLEG): container finished" podID="cc690206-0a0d-487f-942a-eaacf0014165" containerID="674648697e22b3221647833afbe0ad84f56178bab693cda8b10a513986c21349" exitCode=143 Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.557128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerDied","Data":"674648697e22b3221647833afbe0ad84f56178bab693cda8b10a513986c21349"} Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.559387 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6d74595b67-9drnj" podStartSLOduration=7.559370526 podStartE2EDuration="7.559370526s" podCreationTimestamp="2026-03-19 19:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:49.557055381 +0000 UTC m=+1219.662085230" watchObservedRunningTime="2026-03-19 19:16:49.559370526 +0000 UTC m=+1219.664400375" Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.649235 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.649940 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-httpd" containerID="cri-o://ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006" gracePeriod=30 Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.649882 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-log" containerID="cri-o://4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0" gracePeriod=30 Mar 19 19:16:49 crc kubenswrapper[5033]: I0319 19:16:49.710633 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:50 crc kubenswrapper[5033]: I0319 19:16:50.585576 5033 generic.go:334] "Generic (PLEG): container finished" podID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerID="4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0" exitCode=143 Mar 19 19:16:50 crc kubenswrapper[5033]: I0319 19:16:50.585801 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerDied","Data":"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0"} Mar 19 19:16:50 crc kubenswrapper[5033]: I0319 19:16:50.587077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerStarted","Data":"d594db02b31cd9b6d3019b73af42eaf490d2cc5216b1d3fb6d2d2efc69b73b47"} Mar 19 19:16:50 crc kubenswrapper[5033]: I0319 19:16:50.634651 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065ae47e-dafd-4588-b3c6-7340223988ce" path="/var/lib/kubelet/pods/065ae47e-dafd-4588-b3c6-7340223988ce/volumes" Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.087510 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.540276 5033 scope.go:117] "RemoveContainer" containerID="3164491618e21e4463d0c4200061cbb3c985e29e5355e7be7299b32234df048f" Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.626532 5033 generic.go:334] "Generic (PLEG): container finished" podID="be100139-e76b-4549-a7a7-652dcd18354b" 
containerID="a36446b46b023b1c81c31f698eef52d286ae4ecabe9d883ac720569f66d7cb6e" exitCode=0 Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.626618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerDied","Data":"a36446b46b023b1c81c31f698eef52d286ae4ecabe9d883ac720569f66d7cb6e"} Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.629153 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerStarted","Data":"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f"} Mar 19 19:16:51 crc kubenswrapper[5033]: I0319 19:16:51.629201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerStarted","Data":"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c"} Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.082498 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.242257 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config\") pod \"be100139-e76b-4549-a7a7-652dcd18354b\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.242325 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config\") pod \"be100139-e76b-4549-a7a7-652dcd18354b\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.242371 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs\") pod \"be100139-e76b-4549-a7a7-652dcd18354b\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.242407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skwn2\" (UniqueName: \"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2\") pod \"be100139-e76b-4549-a7a7-652dcd18354b\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.242446 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle\") pod \"be100139-e76b-4549-a7a7-652dcd18354b\" (UID: \"be100139-e76b-4549-a7a7-652dcd18354b\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.256050 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2" (OuterVolumeSpecName: "kube-api-access-skwn2") pod "be100139-e76b-4549-a7a7-652dcd18354b" (UID: "be100139-e76b-4549-a7a7-652dcd18354b"). InnerVolumeSpecName "kube-api-access-skwn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.262474 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "be100139-e76b-4549-a7a7-652dcd18354b" (UID: "be100139-e76b-4549-a7a7-652dcd18354b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.317517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be100139-e76b-4549-a7a7-652dcd18354b" (UID: "be100139-e76b-4549-a7a7-652dcd18354b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.345110 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.345139 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skwn2\" (UniqueName: \"kubernetes.io/projected/be100139-e76b-4549-a7a7-652dcd18354b-kube-api-access-skwn2\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.345152 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.348749 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "be100139-e76b-4549-a7a7-652dcd18354b" (UID: "be100139-e76b-4549-a7a7-652dcd18354b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.363638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config" (OuterVolumeSpecName: "config") pod "be100139-e76b-4549-a7a7-652dcd18354b" (UID: "be100139-e76b-4549-a7a7-652dcd18354b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.458566 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.458603 5033 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be100139-e76b-4549-a7a7-652dcd18354b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.663850 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f8fddc48-z2hb6" event={"ID":"be100139-e76b-4549-a7a7-652dcd18354b","Type":"ContainerDied","Data":"93c828d322c8a212d04c5536040b0a3973f99b4bd779522a0be8660e5f72c2c0"} Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.663922 5033 scope.go:117] "RemoveContainer" containerID="43b6cf2576b94a1750263249bcfadb96a5fda6a6440e91e30e3fdeb466699612" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.664093 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f8fddc48-z2hb6" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.674785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerStarted","Data":"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543"} Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.677188 5033 generic.go:334] "Generic (PLEG): container finished" podID="cc690206-0a0d-487f-942a-eaacf0014165" containerID="63efe028fc693f0085f904b7ee103e19c28c6f8ae305f652d0a11d65dc2237db" exitCode=0 Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.677233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerDied","Data":"63efe028fc693f0085f904b7ee103e19c28c6f8ae305f652d0a11d65dc2237db"} Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.677265 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cc690206-0a0d-487f-942a-eaacf0014165","Type":"ContainerDied","Data":"d2e833fd9040ecc087ea93fe6c36b7b5dfc6eb1ad50f9828ab30764b39640540"} Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.677282 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e833fd9040ecc087ea93fe6c36b7b5dfc6eb1ad50f9828ab30764b39640540" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.716286 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.729877 5033 scope.go:117] "RemoveContainer" containerID="a36446b46b023b1c81c31f698eef52d286ae4ecabe9d883ac720569f66d7cb6e" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.745247 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"] Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.761564 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76f8fddc48-z2hb6"] Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.865662 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnsxv\" (UniqueName: \"kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.865995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.866055 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.866092 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc 
kubenswrapper[5033]: I0319 19:16:52.866243 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.866335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.866384 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.866421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle\") pod \"cc690206-0a0d-487f-942a-eaacf0014165\" (UID: \"cc690206-0a0d-487f-942a-eaacf0014165\") " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.868730 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs" (OuterVolumeSpecName: "logs") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.868870 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.870600 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts" (OuterVolumeSpecName: "scripts") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.880648 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv" (OuterVolumeSpecName: "kube-api-access-bnsxv") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "kube-api-access-bnsxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.888943 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c" (OuterVolumeSpecName: "glance") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.907164 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.932160 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.945604 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data" (OuterVolumeSpecName: "config-data") pod "cc690206-0a0d-487f-942a-eaacf0014165" (UID: "cc690206-0a0d-487f-942a-eaacf0014165"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970112 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970141 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970152 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970163 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnsxv\" (UniqueName: \"kubernetes.io/projected/cc690206-0a0d-487f-942a-eaacf0014165-kube-api-access-bnsxv\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970172 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970182 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc690206-0a0d-487f-942a-eaacf0014165-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970191 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc690206-0a0d-487f-942a-eaacf0014165-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.970226 5033 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") on node \"crc\" " Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.998172 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:16:52 crc kubenswrapper[5033]: I0319 19:16:52.998322 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c") on node "crc" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.071968 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.286588 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.479205 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.479297 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.479762 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480250 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7vx9\" (UniqueName: \"kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs" (OuterVolumeSpecName: "logs") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480506 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.480987 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs\") pod \"42e9d774-7140-4b49-bcbf-9a73a5814cda\" (UID: \"42e9d774-7140-4b49-bcbf-9a73a5814cda\") " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.481047 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.482211 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.482311 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e9d774-7140-4b49-bcbf-9a73a5814cda-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.488665 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9" (OuterVolumeSpecName: "kube-api-access-m7vx9") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "kube-api-access-m7vx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.498369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts" (OuterVolumeSpecName: "scripts") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.507419 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854" (OuterVolumeSpecName: "glance") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.523754 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.561388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.565684 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data" (OuterVolumeSpecName: "config-data") pod "42e9d774-7140-4b49-bcbf-9a73a5814cda" (UID: "42e9d774-7140-4b49-bcbf-9a73a5814cda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584655 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7vx9\" (UniqueName: \"kubernetes.io/projected/42e9d774-7140-4b49-bcbf-9a73a5814cda-kube-api-access-m7vx9\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584689 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584699 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584707 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584716 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e9d774-7140-4b49-bcbf-9a73a5814cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.584745 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") on node \"crc\" " Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.609347 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.609522 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854") on node "crc" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687745 5033 generic.go:334] "Generic (PLEG): container finished" podID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerID="ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006" exitCode=0 Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687790 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687830 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerDied","Data":"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006"} Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42e9d774-7140-4b49-bcbf-9a73a5814cda","Type":"ContainerDied","Data":"6ef793be4722adc9897ac5d585d60c3ddb3ad94bd2d25d8c77f8dac09950a1de"} Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687892 5033 scope.go:117] "RemoveContainer" containerID="ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.687842 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.732307 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.747343 5033 scope.go:117] "RemoveContainer" containerID="4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.751714 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.772325 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.779879 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.795426 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.795827 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.795844 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.795866 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-api" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.795873 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-api" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.795897 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.795903 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.796035 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796045 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.796056 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796064 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.796082 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796090 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796273 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be100139-e76b-4549-a7a7-652dcd18354b" containerName="neutron-api" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796296 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796310 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be100139-e76b-4549-a7a7-652dcd18354b" 
containerName="neutron-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796324 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc690206-0a0d-487f-942a-eaacf0014165" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796333 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-httpd" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.796351 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" containerName="glance-log" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.797858 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.802249 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.802664 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.802744 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.802740 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xnfth" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.821901 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.827885 5033 scope.go:117] "RemoveContainer" containerID="ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.828355 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006\": container with ID starting with ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006 not found: ID does not exist" containerID="ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.828583 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006"} err="failed to get container status \"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006\": rpc error: code = NotFound desc = could not find container \"ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006\": container with ID starting with ed04f40d60075c0200e2f7022b558c845ed77b6f62c4cafa16db3d63ceb86006 not found: ID does not exist" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.828673 5033 scope.go:117] "RemoveContainer" containerID="4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0" Mar 19 19:16:53 crc kubenswrapper[5033]: E0319 19:16:53.828909 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0\": container with ID starting with 4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0 not found: ID does not exist" containerID="4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.828990 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0"} err="failed to get container status \"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0\": rpc error: code = NotFound desc = could not find container 
\"4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0\": container with ID starting with 4dc2ee9bdc0ce0497021a68e6b32637e578a8e747503b4a1daddf3010f7276f0 not found: ID does not exist" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.829519 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.834256 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.837372 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.838324 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.838644 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993370 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-config-data\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-logs\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993439 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hslpb\" (UniqueName: \"kubernetes.io/projected/94a07e0d-e86b-4f00-9214-b99ff1484630-kube-api-access-hslpb\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993485 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993503 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2nh\" (UniqueName: \"kubernetes.io/projected/d033188c-9f49-46fb-8650-b579f9b4a6ea-kube-api-access-kv2nh\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993521 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993550 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc 
kubenswrapper[5033]: I0319 19:16:53.993584 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993600 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993617 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993650 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993670 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993697 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993729 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-scripts\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993757 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:53 crc kubenswrapper[5033]: I0319 19:16:53.993772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095481 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-logs\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095509 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hslpb\" (UniqueName: \"kubernetes.io/projected/94a07e0d-e86b-4f00-9214-b99ff1484630-kube-api-access-hslpb\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095550 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095567 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2nh\" (UniqueName: \"kubernetes.io/projected/d033188c-9f49-46fb-8650-b579f9b4a6ea-kube-api-access-kv2nh\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095653 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095690 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095729 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095753 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095778 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095815 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-scripts\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.095865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: 
\"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.097361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.097593 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a07e0d-e86b-4f00-9214-b99ff1484630-logs\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.099719 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.100365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033188c-9f49-46fb-8650-b579f9b4a6ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.102041 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 
19:16:54.102779 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.102955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.103174 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-scripts\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.104345 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.104373 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ced0ba0235b23bf9966e6eaf15351f0da0b1c8b6be463535cbc385832171ea67/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.108110 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.108132 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e28e119150301283e822b5e6f5ae17495e606ad12e9d94c8819dbad244f5550/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.111438 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.112030 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.112202 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a07e0d-e86b-4f00-9214-b99ff1484630-config-data\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.112762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d033188c-9f49-46fb-8650-b579f9b4a6ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.113839 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hslpb\" (UniqueName: \"kubernetes.io/projected/94a07e0d-e86b-4f00-9214-b99ff1484630-kube-api-access-hslpb\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.117821 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2nh\" (UniqueName: \"kubernetes.io/projected/d033188c-9f49-46fb-8650-b579f9b4a6ea-kube-api-access-kv2nh\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.141479 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f50e2d-2b2e-4b4d-8883-5ce384207854\") pod \"glance-default-internal-api-0\" (UID: \"d033188c-9f49-46fb-8650-b579f9b4a6ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.156671 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.164403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c67c318-0dc2-4be8-af40-6b7c1651543c\") pod \"glance-default-external-api-0\" (UID: \"94a07e0d-e86b-4f00-9214-b99ff1484630\") " pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.428731 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.642703 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e9d774-7140-4b49-bcbf-9a73a5814cda" path="/var/lib/kubelet/pods/42e9d774-7140-4b49-bcbf-9a73a5814cda/volumes" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.643333 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be100139-e76b-4549-a7a7-652dcd18354b" path="/var/lib/kubelet/pods/be100139-e76b-4549-a7a7-652dcd18354b/volumes" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.643950 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc690206-0a0d-487f-942a-eaacf0014165" path="/var/lib/kubelet/pods/cc690206-0a0d-487f-942a-eaacf0014165/volumes" Mar 19 19:16:54 crc kubenswrapper[5033]: I0319 19:16:54.707979 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 
19:16:55.166171 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:16:55 crc kubenswrapper[5033]: W0319 19:16:55.186151 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a07e0d_e86b_4f00_9214_b99ff1484630.slice/crio-e400bf9316e6b01cdcc3ff62fe8620993bc8c40b6b67be3b78248caf02bf5a10 WatchSource:0}: Error finding container e400bf9316e6b01cdcc3ff62fe8620993bc8c40b6b67be3b78248caf02bf5a10: Status 404 returned error can't find the container with id e400bf9316e6b01cdcc3ff62fe8620993bc8c40b6b67be3b78248caf02bf5a10 Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.760808 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerStarted","Data":"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5"} Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.761110 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-central-agent" containerID="cri-o://f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c" gracePeriod=30 Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.761219 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.761293 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="proxy-httpd" containerID="cri-o://f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5" gracePeriod=30 Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.761343 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="sg-core" containerID="cri-o://d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543" gracePeriod=30 Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.761375 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-notification-agent" containerID="cri-o://d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f" gracePeriod=30 Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.774341 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94a07e0d-e86b-4f00-9214-b99ff1484630","Type":"ContainerStarted","Data":"e400bf9316e6b01cdcc3ff62fe8620993bc8c40b6b67be3b78248caf02bf5a10"} Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.779413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d033188c-9f49-46fb-8650-b579f9b4a6ea","Type":"ContainerStarted","Data":"50dcc06cf9a9c1fbb4dc975d16ffbadba78c69bcc4f19aebf5ca827125d22439"} Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.779442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d033188c-9f49-46fb-8650-b579f9b4a6ea","Type":"ContainerStarted","Data":"03dcbc5f4ed932e639d7a93e11f525c42917aa174fbacdc756f7ea8678ee4d0f"} Mar 19 19:16:55 crc kubenswrapper[5033]: I0319 19:16:55.787767 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.02044211 podStartE2EDuration="7.787753281s" podCreationTimestamp="2026-03-19 19:16:48 +0000 UTC" firstStartedPulling="2026-03-19 19:16:49.728441074 +0000 UTC m=+1219.833470923" lastFinishedPulling="2026-03-19 19:16:54.495752245 +0000 UTC m=+1224.600782094" observedRunningTime="2026-03-19 19:16:55.782576715 +0000 UTC 
m=+1225.887606564" watchObservedRunningTime="2026-03-19 19:16:55.787753281 +0000 UTC m=+1225.892783130" Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791048 5033 generic.go:334] "Generic (PLEG): container finished" podID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerID="f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5" exitCode=0 Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791417 5033 generic.go:334] "Generic (PLEG): container finished" podID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerID="d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543" exitCode=2 Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791435 5033 generic.go:334] "Generic (PLEG): container finished" podID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerID="d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f" exitCode=0 Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791130 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerDied","Data":"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerDied","Data":"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.791512 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerDied","Data":"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.793187 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"94a07e0d-e86b-4f00-9214-b99ff1484630","Type":"ContainerStarted","Data":"93d3baa391c64d18b41c75bff9ede3c5cfc9d44a70f1c1b2c645f5aab562b137"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.793212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94a07e0d-e86b-4f00-9214-b99ff1484630","Type":"ContainerStarted","Data":"fd8b35ba91bd7eaa3fdf7c12fc01a5e389f65d3e6ed289fff64edced2411d2f6"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.796171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d033188c-9f49-46fb-8650-b579f9b4a6ea","Type":"ContainerStarted","Data":"3ea7061c816c9806af73f934b8195cc39067b8aa85a50ebf59ef98bf797ec946"} Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.824123 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.824101826 podStartE2EDuration="3.824101826s" podCreationTimestamp="2026-03-19 19:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:56.810801151 +0000 UTC m=+1226.915831010" watchObservedRunningTime="2026-03-19 19:16:56.824101826 +0000 UTC m=+1226.929131665" Mar 19 19:16:56 crc kubenswrapper[5033]: I0319 19:16:56.850261 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.850237783 podStartE2EDuration="3.850237783s" podCreationTimestamp="2026-03-19 19:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:56.842635158 +0000 UTC m=+1226.947665017" watchObservedRunningTime="2026-03-19 19:16:56.850237783 +0000 UTC m=+1226.955267632" Mar 19 19:16:57 crc kubenswrapper[5033]: I0319 19:16:57.990195 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d74595b67-9drnj" Mar 19 19:16:57 crc kubenswrapper[5033]: I0319 19:16:57.990691 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d74595b67-9drnj" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.447193 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526221 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lghv\" (UniqueName: \"kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526754 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526807 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.526926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml\") pod \"a9f27f53-975c-4b9a-a051-a67c718b871c\" (UID: \"a9f27f53-975c-4b9a-a051-a67c718b871c\") " Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.527821 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.528149 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.533009 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv" (OuterVolumeSpecName: "kube-api-access-6lghv") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "kube-api-access-6lghv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.547659 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts" (OuterVolumeSpecName: "scripts") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.570122 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.616914 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629362 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629395 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629406 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lghv\" (UniqueName: \"kubernetes.io/projected/a9f27f53-975c-4b9a-a051-a67c718b871c-kube-api-access-6lghv\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629417 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629425 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.629434 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9f27f53-975c-4b9a-a051-a67c718b871c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.638437 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data" (OuterVolumeSpecName: "config-data") pod "a9f27f53-975c-4b9a-a051-a67c718b871c" (UID: "a9f27f53-975c-4b9a-a051-a67c718b871c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.731317 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9f27f53-975c-4b9a-a051-a67c718b871c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.835160 5033 generic.go:334] "Generic (PLEG): container finished" podID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerID="f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c" exitCode=0 Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.835200 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerDied","Data":"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c"} Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.835227 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9f27f53-975c-4b9a-a051-a67c718b871c","Type":"ContainerDied","Data":"d594db02b31cd9b6d3019b73af42eaf490d2cc5216b1d3fb6d2d2efc69b73b47"} Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.835244 5033 scope.go:117] "RemoveContainer" containerID="f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.835238 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.865130 5033 scope.go:117] "RemoveContainer" containerID="d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.901381 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.923180 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.928965 5033 scope.go:117] "RemoveContainer" containerID="d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.945697 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:00 crc kubenswrapper[5033]: E0319 19:17:00.946120 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-central-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946137 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-central-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: E0319 19:17:00.946154 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-notification-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946161 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-notification-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: E0319 19:17:00.946176 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="sg-core" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946182 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="sg-core" Mar 19 19:17:00 crc kubenswrapper[5033]: E0319 19:17:00.946196 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="proxy-httpd" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="proxy-httpd" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946379 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-notification-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946391 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="proxy-httpd" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946403 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="sg-core" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.946419 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" containerName="ceilometer-central-agent" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.948156 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.953116 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.955434 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.957276 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:17:00 crc kubenswrapper[5033]: I0319 19:17:00.977368 5033 scope.go:117] "RemoveContainer" containerID="f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.037120 5033 scope.go:117] "RemoveContainer" containerID="f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5" Mar 19 19:17:01 crc kubenswrapper[5033]: E0319 19:17:01.037912 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5\": container with ID starting with f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5 not found: ID does not exist" containerID="f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.037958 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5"} err="failed to get container status \"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5\": rpc error: code = NotFound desc = could not find container \"f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5\": container with ID starting with f503fa25aba084d831e62d0c33361900790d097b0d56949171ed8d8337db59f5 not found: ID does not exist" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 
19:17:01.037986 5033 scope.go:117] "RemoveContainer" containerID="d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543" Mar 19 19:17:01 crc kubenswrapper[5033]: E0319 19:17:01.038423 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543\": container with ID starting with d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543 not found: ID does not exist" containerID="d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038467 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543"} err="failed to get container status \"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543\": rpc error: code = NotFound desc = could not find container \"d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543\": container with ID starting with d6d1ce6f0a694f58b492a1f742516a01c3276b6c8e97d96219ef3db294f5a543 not found: ID does not exist" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038488 5033 scope.go:117] "RemoveContainer" containerID="d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038595 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: E0319 19:17:01.038779 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f\": 
container with ID starting with d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f not found: ID does not exist" containerID="d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038805 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f"} err="failed to get container status \"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f\": rpc error: code = NotFound desc = could not find container \"d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f\": container with ID starting with d2d2063d6068faf7c360bf0548ea71c3df7d5fe0b7f47f0a218cacba0429537f not found: ID does not exist" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038820 5033 scope.go:117] "RemoveContainer" containerID="f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.038777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039014 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: E0319 19:17:01.039077 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c\": container with ID 
starting with f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c not found: ID does not exist" containerID="f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039108 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c"} err="failed to get container status \"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c\": rpc error: code = NotFound desc = could not find container \"f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c\": container with ID starting with f7d82ee5e5181ed90c8c743e207b39280d53c2a4748dabeebbf3e5bf4abcec3c not found: ID does not exist" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039084 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039368 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5hm\" (UniqueName: \"kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039479 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.039584 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142231 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142306 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5hm\" (UniqueName: \"kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142340 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.142570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.143034 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.143082 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.148428 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.150388 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.157883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.161924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5hm\" (UniqueName: \"kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.170897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts\") pod \"ceilometer-0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.263525 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.713337 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:01 crc kubenswrapper[5033]: I0319 19:17:01.845705 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerStarted","Data":"5afa978066c613a3fd86757ab155b1cee242ecc5f2076d725c4163f991d0403a"} Mar 19 19:17:02 crc kubenswrapper[5033]: I0319 19:17:02.631087 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f27f53-975c-4b9a-a051-a67c718b871c" path="/var/lib/kubelet/pods/a9f27f53-975c-4b9a-a051-a67c718b871c/volumes" Mar 19 19:17:02 crc kubenswrapper[5033]: I0319 19:17:02.857608 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerStarted","Data":"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1"} Mar 19 19:17:03 crc kubenswrapper[5033]: I0319 19:17:03.871416 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerStarted","Data":"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b"} Mar 19 19:17:03 crc kubenswrapper[5033]: I0319 19:17:03.872669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerStarted","Data":"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08"} Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.157831 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.157888 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.197246 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.227422 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.429109 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.429416 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.467230 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.489299 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.881051 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.881086 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.881098 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:04 crc kubenswrapper[5033]: I0319 19:17:04.881331 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:06 crc kubenswrapper[5033]: I0319 19:17:06.897003 5033 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 19 19:17:06 crc kubenswrapper[5033]: I0319 19:17:06.897248 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:17:06 crc kubenswrapper[5033]: I0319 19:17:06.897003 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:17:06 crc kubenswrapper[5033]: I0319 19:17:06.897358 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.227823 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.229218 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.391266 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.392021 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.909246 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerStarted","Data":"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1"} Mar 19 19:17:07 crc kubenswrapper[5033]: I0319 19:17:07.931237 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.57236649 podStartE2EDuration="7.931220553s" podCreationTimestamp="2026-03-19 19:17:00 +0000 UTC" firstStartedPulling="2026-03-19 19:17:01.722310088 +0000 UTC m=+1231.827339937" lastFinishedPulling="2026-03-19 19:17:07.081164151 +0000 UTC m=+1237.186194000" observedRunningTime="2026-03-19 19:17:07.92829372 +0000 UTC m=+1238.033323559" 
watchObservedRunningTime="2026-03-19 19:17:07.931220553 +0000 UTC m=+1238.036250392" Mar 19 19:17:08 crc kubenswrapper[5033]: I0319 19:17:08.917766 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.140711 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.579808 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.934821 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="sg-core" containerID="cri-o://5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b" gracePeriod=30 Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.934847 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="proxy-httpd" containerID="cri-o://1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1" gracePeriod=30 Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.934890 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-notification-agent" containerID="cri-o://ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08" gracePeriod=30 Mar 19 19:17:10 crc kubenswrapper[5033]: I0319 19:17:10.934957 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-central-agent" containerID="cri-o://dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1" gracePeriod=30 Mar 19 19:17:11 crc 
kubenswrapper[5033]: I0319 19:17:11.945406 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerID="1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1" exitCode=0 Mar 19 19:17:11 crc kubenswrapper[5033]: I0319 19:17:11.945723 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerID="5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b" exitCode=2 Mar 19 19:17:11 crc kubenswrapper[5033]: I0319 19:17:11.945484 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerDied","Data":"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1"} Mar 19 19:17:11 crc kubenswrapper[5033]: I0319 19:17:11.945770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerDied","Data":"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b"} Mar 19 19:17:11 crc kubenswrapper[5033]: I0319 19:17:11.945784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerDied","Data":"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08"} Mar 19 19:17:11 crc kubenswrapper[5033]: I0319 19:17:11.945737 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerID="ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08" exitCode=0 Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.048117 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2z6h7"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.049886 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.069760 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2z6h7"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.135823 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.135866 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6kr\" (UniqueName: \"kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.145479 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nll7n"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.146848 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.159150 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nll7n"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.238117 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.238251 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.238275 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6kr\" (UniqueName: \"kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.238314 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xb8\" (UniqueName: \"kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.238994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.249609 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8350-account-create-update-7bhxl"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.252266 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.254793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.261183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6kr\" (UniqueName: \"kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr\") pod \"nova-api-db-create-2z6h7\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.265289 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8350-account-create-update-7bhxl"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.340339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfh9m\" (UniqueName: \"kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.340395 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xb8\" (UniqueName: 
\"kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.340503 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.340542 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.341265 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.362684 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d5pz8"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.363922 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.373008 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.386206 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d5pz8"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.389295 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xb8\" (UniqueName: \"kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8\") pod \"nova-cell0-db-create-nll7n\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.442178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfh9m\" (UniqueName: \"kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.442283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.442373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49czk\" (UniqueName: \"kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk\") pod \"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.442408 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts\") pod \"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.443518 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.465433 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-08d0-account-create-update-5mrwk"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.466938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfh9m\" (UniqueName: \"kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m\") pod \"nova-api-8350-account-create-update-7bhxl\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.467066 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.467990 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.471054 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.501503 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-08d0-account-create-update-5mrwk"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.551140 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.551187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts\") pod \"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.551229 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clzs\" (UniqueName: \"kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.551406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49czk\" (UniqueName: \"kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk\") pod 
\"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.552003 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts\") pod \"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.574776 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49czk\" (UniqueName: \"kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk\") pod \"nova-cell1-db-create-d5pz8\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.646015 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.654114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.654191 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clzs\" (UniqueName: \"kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.656475 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.663710 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9863-account-create-update-hwtsv"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.665350 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.683600 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9863-account-create-update-hwtsv"] Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.686748 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.687163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clzs\" (UniqueName: \"kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs\") pod \"nova-cell0-08d0-account-create-update-5mrwk\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.755778 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.755919 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8srs\" (UniqueName: \"kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.766482 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.860156 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8srs\" (UniqueName: \"kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.860316 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.861126 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.883837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8srs\" (UniqueName: \"kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs\") pod \"nova-cell1-9863-account-create-update-hwtsv\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:16 crc kubenswrapper[5033]: I0319 19:17:16.913011 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.003863 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.011202 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2z6h7"] Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.169485 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nll7n"] Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.634025 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8350-account-create-update-7bhxl"] Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.645009 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d5pz8"] Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.785910 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-08d0-account-create-update-5mrwk"] Mar 19 19:17:17 crc kubenswrapper[5033]: W0319 19:17:17.786805 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83e6b50_dafa_4bb0_96d1_89ad756e947a.slice/crio-4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0 WatchSource:0}: Error finding container 4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0: Status 404 returned error can't find the container with id 4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0 Mar 19 19:17:17 crc kubenswrapper[5033]: I0319 19:17:17.864182 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9863-account-create-update-hwtsv"] Mar 19 19:17:17 crc kubenswrapper[5033]: W0319 19:17:17.869387 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f78ec8_661f_4f3b_88b6_cc687515ba76.slice/crio-141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9 WatchSource:0}: Error finding container 141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9: Status 404 returned error can't find the container with id 141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9 Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.025820 5033 generic.go:334] "Generic (PLEG): container finished" podID="6d670e98-83a9-43c0-b723-caf42651cfcc" containerID="534cacdb4a77f3f82b88746dcdec6a48bed1e5408eab3027018d3455072baf70" exitCode=0 Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.025914 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nll7n" event={"ID":"6d670e98-83a9-43c0-b723-caf42651cfcc","Type":"ContainerDied","Data":"534cacdb4a77f3f82b88746dcdec6a48bed1e5408eab3027018d3455072baf70"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.025946 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nll7n" event={"ID":"6d670e98-83a9-43c0-b723-caf42651cfcc","Type":"ContainerStarted","Data":"17f66a3d4a2711236c66f6a3f16f81b2bfef7224c7387c741e6494bb27976f2e"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.027276 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" event={"ID":"49f78ec8-661f-4f3b-88b6-cc687515ba76","Type":"ContainerStarted","Data":"141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.028491 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" event={"ID":"b83e6b50-dafa-4bb0-96d1-89ad756e947a","Type":"ContainerStarted","Data":"4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 
19:17:18.029820 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8350-account-create-update-7bhxl" event={"ID":"f469bfad-5e99-414b-b44c-18c5fde5a96b","Type":"ContainerStarted","Data":"c8de08b05f8963e58049870bb4120136e569f9b34b6d5ec3743f988013cadf32"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.029844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8350-account-create-update-7bhxl" event={"ID":"f469bfad-5e99-414b-b44c-18c5fde5a96b","Type":"ContainerStarted","Data":"885e972a55f386719766e96b404908662f122302403ae3beea9f47e0136bfe54"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.032069 5033 generic.go:334] "Generic (PLEG): container finished" podID="14f7fa82-563b-4906-8e47-0ef910e3993a" containerID="9747160d44c1d6cba806d0c6f3ba0073bf6fda85fb15e8ecae3042aa0cd43364" exitCode=0 Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.032115 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2z6h7" event={"ID":"14f7fa82-563b-4906-8e47-0ef910e3993a","Type":"ContainerDied","Data":"9747160d44c1d6cba806d0c6f3ba0073bf6fda85fb15e8ecae3042aa0cd43364"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.032131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2z6h7" event={"ID":"14f7fa82-563b-4906-8e47-0ef910e3993a","Type":"ContainerStarted","Data":"795a3b8a8c935771f15632426d4e0aa1c75b5e348da7bc8d4603ba0668c3183a"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.034636 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d5pz8" event={"ID":"0ac10a03-94cf-460b-9d47-305d6ffaa16a","Type":"ContainerStarted","Data":"3cc6fba13cbb2ebb459b195296ab86e2695c67baf4026c237fcf55201ab24a8b"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.034667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d5pz8" 
event={"ID":"0ac10a03-94cf-460b-9d47-305d6ffaa16a","Type":"ContainerStarted","Data":"4c946c9248b628aca2208fe096c8961a69c729ad2df8090e9abb217dcd9c9880"} Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.112628 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8350-account-create-update-7bhxl" podStartSLOduration=2.112610104 podStartE2EDuration="2.112610104s" podCreationTimestamp="2026-03-19 19:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:18.088952277 +0000 UTC m=+1248.193982126" watchObservedRunningTime="2026-03-19 19:17:18.112610104 +0000 UTC m=+1248.217639953" Mar 19 19:17:18 crc kubenswrapper[5033]: I0319 19:17:18.114018 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-d5pz8" podStartSLOduration=2.113991003 podStartE2EDuration="2.113991003s" podCreationTimestamp="2026-03-19 19:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:18.107172331 +0000 UTC m=+1248.212202180" watchObservedRunningTime="2026-03-19 19:17:18.113991003 +0000 UTC m=+1248.219020852" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.044564 5033 generic.go:334] "Generic (PLEG): container finished" podID="b83e6b50-dafa-4bb0-96d1-89ad756e947a" containerID="f796b21829d82e98c088eeaabacbafbe9b9bca08ba4d8b872e2a9ff2635d3e7a" exitCode=0 Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.044632 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" event={"ID":"b83e6b50-dafa-4bb0-96d1-89ad756e947a","Type":"ContainerDied","Data":"f796b21829d82e98c088eeaabacbafbe9b9bca08ba4d8b872e2a9ff2635d3e7a"} Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.047427 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="f469bfad-5e99-414b-b44c-18c5fde5a96b" containerID="c8de08b05f8963e58049870bb4120136e569f9b34b6d5ec3743f988013cadf32" exitCode=0 Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.047504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8350-account-create-update-7bhxl" event={"ID":"f469bfad-5e99-414b-b44c-18c5fde5a96b","Type":"ContainerDied","Data":"c8de08b05f8963e58049870bb4120136e569f9b34b6d5ec3743f988013cadf32"} Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.049423 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ac10a03-94cf-460b-9d47-305d6ffaa16a" containerID="3cc6fba13cbb2ebb459b195296ab86e2695c67baf4026c237fcf55201ab24a8b" exitCode=0 Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.049582 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d5pz8" event={"ID":"0ac10a03-94cf-460b-9d47-305d6ffaa16a","Type":"ContainerDied","Data":"3cc6fba13cbb2ebb459b195296ab86e2695c67baf4026c237fcf55201ab24a8b"} Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.051730 5033 generic.go:334] "Generic (PLEG): container finished" podID="49f78ec8-661f-4f3b-88b6-cc687515ba76" containerID="361957c22d69df4bf08a7ff9336fe6772f71607f26c2fe4b9344eba8458ddf19" exitCode=0 Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.051970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" event={"ID":"49f78ec8-661f-4f3b-88b6-cc687515ba76","Type":"ContainerDied","Data":"361957c22d69df4bf08a7ff9336fe6772f71607f26c2fe4b9344eba8458ddf19"} Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.726706 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.765108 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6kr\" (UniqueName: \"kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr\") pod \"14f7fa82-563b-4906-8e47-0ef910e3993a\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.765316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts\") pod \"14f7fa82-563b-4906-8e47-0ef910e3993a\" (UID: \"14f7fa82-563b-4906-8e47-0ef910e3993a\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.778334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14f7fa82-563b-4906-8e47-0ef910e3993a" (UID: "14f7fa82-563b-4906-8e47-0ef910e3993a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.830209 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr" (OuterVolumeSpecName: "kube-api-access-ms6kr") pod "14f7fa82-563b-4906-8e47-0ef910e3993a" (UID: "14f7fa82-563b-4906-8e47-0ef910e3993a"). InnerVolumeSpecName "kube-api-access-ms6kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.868360 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14f7fa82-563b-4906-8e47-0ef910e3993a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.868603 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6kr\" (UniqueName: \"kubernetes.io/projected/14f7fa82-563b-4906-8e47-0ef910e3993a-kube-api-access-ms6kr\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.926581 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.933095 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970727 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970807 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970893 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: 
\"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970928 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts\") pod \"6d670e98-83a9-43c0-b723-caf42651cfcc\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970950 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5xb8\" (UniqueName: \"kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8\") pod \"6d670e98-83a9-43c0-b723-caf42651cfcc\" (UID: \"6d670e98-83a9-43c0-b723-caf42651cfcc\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.970988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.971003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc5hm\" (UniqueName: \"kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.971044 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.971141 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle\") pod \"e2516837-e89c-4f10-a494-cfbedfd917f0\" (UID: \"e2516837-e89c-4f10-a494-cfbedfd917f0\") " Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.971587 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d670e98-83a9-43c0-b723-caf42651cfcc" (UID: "6d670e98-83a9-43c0-b723-caf42651cfcc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.975081 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.975978 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.977172 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm" (OuterVolumeSpecName: "kube-api-access-cc5hm") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "kube-api-access-cc5hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.978560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts" (OuterVolumeSpecName: "scripts") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:19 crc kubenswrapper[5033]: I0319 19:17:19.978891 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8" (OuterVolumeSpecName: "kube-api-access-c5xb8") pod "6d670e98-83a9-43c0-b723-caf42651cfcc" (UID: "6d670e98-83a9-43c0-b723-caf42651cfcc"). InnerVolumeSpecName "kube-api-access-c5xb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.009579 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.052368 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.066687 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nll7n" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.067226 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nll7n" event={"ID":"6d670e98-83a9-43c0-b723-caf42651cfcc","Type":"ContainerDied","Data":"17f66a3d4a2711236c66f6a3f16f81b2bfef7224c7387c741e6494bb27976f2e"} Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.067328 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f66a3d4a2711236c66f6a3f16f81b2bfef7224c7387c741e6494bb27976f2e" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072514 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072550 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072567 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072581 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d670e98-83a9-43c0-b723-caf42651cfcc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072594 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5xb8\" (UniqueName: \"kubernetes.io/projected/6d670e98-83a9-43c0-b723-caf42651cfcc-kube-api-access-c5xb8\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 
19:17:20.072606 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072618 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc5hm\" (UniqueName: \"kubernetes.io/projected/e2516837-e89c-4f10-a494-cfbedfd917f0-kube-api-access-cc5hm\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.072629 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2516837-e89c-4f10-a494-cfbedfd917f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.073307 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerID="dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1" exitCode=0 Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.073367 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.073376 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerDied","Data":"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1"} Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.073405 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2516837-e89c-4f10-a494-cfbedfd917f0","Type":"ContainerDied","Data":"5afa978066c613a3fd86757ab155b1cee242ecc5f2076d725c4163f991d0403a"} Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.073423 5033 scope.go:117] "RemoveContainer" containerID="1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.075697 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2z6h7" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.076333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2z6h7" event={"ID":"14f7fa82-563b-4906-8e47-0ef910e3993a","Type":"ContainerDied","Data":"795a3b8a8c935771f15632426d4e0aa1c75b5e348da7bc8d4603ba0668c3183a"} Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.076374 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="795a3b8a8c935771f15632426d4e0aa1c75b5e348da7bc8d4603ba0668c3183a" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.089918 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data" (OuterVolumeSpecName: "config-data") pod "e2516837-e89c-4f10-a494-cfbedfd917f0" (UID: "e2516837-e89c-4f10-a494-cfbedfd917f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.109557 5033 scope.go:117] "RemoveContainer" containerID="5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.158953 5033 scope.go:117] "RemoveContainer" containerID="ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.175337 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2516837-e89c-4f10-a494-cfbedfd917f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.198886 5033 scope.go:117] "RemoveContainer" containerID="dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.236442 5033 scope.go:117] "RemoveContainer" containerID="1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.236956 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1\": container with ID starting with 1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1 not found: ID does not exist" containerID="1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.237011 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1"} err="failed to get container status \"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1\": rpc error: code = NotFound desc = could not find container \"1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1\": container with ID starting with 
1453b86c9f561a90752f5635e51009ffc711c16ce5dedfd43c28a52f2b9897b1 not found: ID does not exist" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.237045 5033 scope.go:117] "RemoveContainer" containerID="5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.237664 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b\": container with ID starting with 5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b not found: ID does not exist" containerID="5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.237696 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b"} err="failed to get container status \"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b\": rpc error: code = NotFound desc = could not find container \"5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b\": container with ID starting with 5710ba79adf55ab97abb10c44a685ae4ba3b4176d22555add326172e1039407b not found: ID does not exist" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.237715 5033 scope.go:117] "RemoveContainer" containerID="ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.238504 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08\": container with ID starting with ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08 not found: ID does not exist" containerID="ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08" Mar 19 19:17:20 crc 
kubenswrapper[5033]: I0319 19:17:20.238537 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08"} err="failed to get container status \"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08\": rpc error: code = NotFound desc = could not find container \"ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08\": container with ID starting with ec8b800ad79e66c6e2a63c59688f06fbc4e6eca8159616b4340f051f81944c08 not found: ID does not exist" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.238553 5033 scope.go:117] "RemoveContainer" containerID="dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.238947 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1\": container with ID starting with dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1 not found: ID does not exist" containerID="dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.238971 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1"} err="failed to get container status \"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1\": rpc error: code = NotFound desc = could not find container \"dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1\": container with ID starting with dd6fc3e9c9f4d9ad5f7f98e057693b4f1857e373b3312aed5ea4498fc0b3cbd1 not found: ID does not exist" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.410992 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:20 crc kubenswrapper[5033]: 
I0319 19:17:20.431077 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456069 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456471 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d670e98-83a9-43c0-b723-caf42651cfcc" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456486 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d670e98-83a9-43c0-b723-caf42651cfcc" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456510 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-central-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456516 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-central-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456538 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="proxy-httpd" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456544 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="proxy-httpd" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456556 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f7fa82-563b-4906-8e47-0ef910e3993a" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456561 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f7fa82-563b-4906-8e47-0ef910e3993a" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456577 5033 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="sg-core" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456584 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="sg-core" Mar 19 19:17:20 crc kubenswrapper[5033]: E0319 19:17:20.456600 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-notification-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456606 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-notification-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456770 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d670e98-83a9-43c0-b723-caf42651cfcc" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456783 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f7fa82-563b-4906-8e47-0ef910e3993a" containerName="mariadb-database-create" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456792 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-notification-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456821 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="proxy-httpd" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456841 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="ceilometer-central-agent" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.456849 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" containerName="sg-core" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.459296 
5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.462327 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.462604 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.471546 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.483764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.483841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.483870 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjfh\" (UniqueName: \"kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.483936 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd\") pod \"ceilometer-0\" 
(UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.484914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.484953 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.484999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.591490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.591534 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.591592 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.592118 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.593170 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.593215 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.593236 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjfh\" (UniqueName: \"kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.593283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " 
pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.593617 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.600035 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.600377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.600399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.600615 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.610482 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjfh\" (UniqueName: 
\"kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh\") pod \"ceilometer-0\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") " pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.644650 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2516837-e89c-4f10-a494-cfbedfd917f0" path="/var/lib/kubelet/pods/e2516837-e89c-4f10-a494-cfbedfd917f0/volumes" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.789324 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.815045 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.898932 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts\") pod \"f469bfad-5e99-414b-b44c-18c5fde5a96b\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.899101 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfh9m\" (UniqueName: \"kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m\") pod \"f469bfad-5e99-414b-b44c-18c5fde5a96b\" (UID: \"f469bfad-5e99-414b-b44c-18c5fde5a96b\") " Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.900028 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f469bfad-5e99-414b-b44c-18c5fde5a96b" (UID: "f469bfad-5e99-414b-b44c-18c5fde5a96b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:20 crc kubenswrapper[5033]: I0319 19:17:20.936054 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m" (OuterVolumeSpecName: "kube-api-access-nfh9m") pod "f469bfad-5e99-414b-b44c-18c5fde5a96b" (UID: "f469bfad-5e99-414b-b44c-18c5fde5a96b"). InnerVolumeSpecName "kube-api-access-nfh9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.016570 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f469bfad-5e99-414b-b44c-18c5fde5a96b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.016611 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfh9m\" (UniqueName: \"kubernetes.io/projected/f469bfad-5e99-414b-b44c-18c5fde5a96b-kube-api-access-nfh9m\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.099565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8350-account-create-update-7bhxl" event={"ID":"f469bfad-5e99-414b-b44c-18c5fde5a96b","Type":"ContainerDied","Data":"885e972a55f386719766e96b404908662f122302403ae3beea9f47e0136bfe54"} Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.099619 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885e972a55f386719766e96b404908662f122302403ae3beea9f47e0136bfe54" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.099674 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8350-account-create-update-7bhxl" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.155208 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.161734 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.165572 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.222248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts\") pod \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.223672 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts\") pod \"49f78ec8-661f-4f3b-88b6-cc687515ba76\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.223421 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ac10a03-94cf-460b-9d47-305d6ffaa16a" (UID: "0ac10a03-94cf-460b-9d47-305d6ffaa16a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.223753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8srs\" (UniqueName: \"kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs\") pod \"49f78ec8-661f-4f3b-88b6-cc687515ba76\" (UID: \"49f78ec8-661f-4f3b-88b6-cc687515ba76\") " Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.224122 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49f78ec8-661f-4f3b-88b6-cc687515ba76" (UID: "49f78ec8-661f-4f3b-88b6-cc687515ba76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.224746 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49czk\" (UniqueName: \"kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk\") pod \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\" (UID: \"0ac10a03-94cf-460b-9d47-305d6ffaa16a\") " Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.224783 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts\") pod \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.224860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clzs\" (UniqueName: \"kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs\") pod \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\" (UID: \"b83e6b50-dafa-4bb0-96d1-89ad756e947a\") " Mar 19 19:17:21 crc 
kubenswrapper[5033]: I0319 19:17:21.225936 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ac10a03-94cf-460b-9d47-305d6ffaa16a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.225952 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f78ec8-661f-4f3b-88b6-cc687515ba76-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.226800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b83e6b50-dafa-4bb0-96d1-89ad756e947a" (UID: "b83e6b50-dafa-4bb0-96d1-89ad756e947a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.252949 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk" (OuterVolumeSpecName: "kube-api-access-49czk") pod "0ac10a03-94cf-460b-9d47-305d6ffaa16a" (UID: "0ac10a03-94cf-460b-9d47-305d6ffaa16a"). InnerVolumeSpecName "kube-api-access-49czk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.253092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs" (OuterVolumeSpecName: "kube-api-access-h8srs") pod "49f78ec8-661f-4f3b-88b6-cc687515ba76" (UID: "49f78ec8-661f-4f3b-88b6-cc687515ba76"). InnerVolumeSpecName "kube-api-access-h8srs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.253116 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs" (OuterVolumeSpecName: "kube-api-access-6clzs") pod "b83e6b50-dafa-4bb0-96d1-89ad756e947a" (UID: "b83e6b50-dafa-4bb0-96d1-89ad756e947a"). InnerVolumeSpecName "kube-api-access-6clzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.327352 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49czk\" (UniqueName: \"kubernetes.io/projected/0ac10a03-94cf-460b-9d47-305d6ffaa16a-kube-api-access-49czk\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.327384 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83e6b50-dafa-4bb0-96d1-89ad756e947a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.327392 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clzs\" (UniqueName: \"kubernetes.io/projected/b83e6b50-dafa-4bb0-96d1-89ad756e947a-kube-api-access-6clzs\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.327402 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8srs\" (UniqueName: \"kubernetes.io/projected/49f78ec8-661f-4f3b-88b6-cc687515ba76-kube-api-access-h8srs\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[5033]: I0319 19:17:21.418079 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:21 crc kubenswrapper[5033]: W0319 19:17:21.427549 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80fd634e_a900_449a_9ff8_0c8dc555e36c.slice/crio-3319c2129197e19b0dee46743c437a9e5cb198194b556597d48de874ca7985dd WatchSource:0}: Error finding container 3319c2129197e19b0dee46743c437a9e5cb198194b556597d48de874ca7985dd: Status 404 returned error can't find the container with id 3319c2129197e19b0dee46743c437a9e5cb198194b556597d48de874ca7985dd Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.134988 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.135000 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9863-account-create-update-hwtsv" event={"ID":"49f78ec8-661f-4f3b-88b6-cc687515ba76","Type":"ContainerDied","Data":"141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9"} Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.136361 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="141b5500771113f7ac8ae8ee070606dd91b259ca50651c32d5fc3bf4b6d23eb9" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.145724 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" event={"ID":"b83e6b50-dafa-4bb0-96d1-89ad756e947a","Type":"ContainerDied","Data":"4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0"} Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.145766 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4410247a179f4691f4234c255836a50dcaaee75a1ad03f8f0d164be7d06c9ea0" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.145843 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-08d0-account-create-update-5mrwk" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.152427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerStarted","Data":"3319c2129197e19b0dee46743c437a9e5cb198194b556597d48de874ca7985dd"} Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.158170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d5pz8" event={"ID":"0ac10a03-94cf-460b-9d47-305d6ffaa16a","Type":"ContainerDied","Data":"4c946c9248b628aca2208fe096c8961a69c729ad2df8090e9abb217dcd9c9880"} Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.158217 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c946c9248b628aca2208fe096c8961a69c729ad2df8090e9abb217dcd9c9880" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.158245 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d5pz8" Mar 19 19:17:22 crc kubenswrapper[5033]: I0319 19:17:22.708672 5033 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbe100139-e76b-4549-a7a7-652dcd18354b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbe100139-e76b-4549-a7a7-652dcd18354b] : Timed out while waiting for systemd to remove kubepods-besteffort-podbe100139_e76b_4549_a7a7_652dcd18354b.slice" Mar 19 19:17:24 crc kubenswrapper[5033]: I0319 19:17:24.177416 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerStarted","Data":"90188090d5f8db174a8b9fbf7f7cf5693044fe3c527d679527fc9092456b9a8d"} Mar 19 19:17:24 crc kubenswrapper[5033]: I0319 19:17:24.177951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerStarted","Data":"11f12147f7ca93e7ddc7a2f10811187b0d98c8fe525eaf914a50914419d8ea56"} Mar 19 19:17:25 crc kubenswrapper[5033]: I0319 19:17:25.194391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerStarted","Data":"3473e4ce35a32bd0f3d16927c60c7ee9e0446057a4193d5e0dd9bf7d4ee40e18"} Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.729548 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kpkmt"] Mar 19 19:17:26 crc kubenswrapper[5033]: E0319 19:17:26.730473 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83e6b50-dafa-4bb0-96d1-89ad756e947a" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730487 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83e6b50-dafa-4bb0-96d1-89ad756e947a" containerName="mariadb-account-create-update" Mar 
19 19:17:26 crc kubenswrapper[5033]: E0319 19:17:26.730500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac10a03-94cf-460b-9d47-305d6ffaa16a" containerName="mariadb-database-create" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730506 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac10a03-94cf-460b-9d47-305d6ffaa16a" containerName="mariadb-database-create" Mar 19 19:17:26 crc kubenswrapper[5033]: E0319 19:17:26.730531 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f469bfad-5e99-414b-b44c-18c5fde5a96b" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730537 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f469bfad-5e99-414b-b44c-18c5fde5a96b" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: E0319 19:17:26.730551 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f78ec8-661f-4f3b-88b6-cc687515ba76" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730556 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f78ec8-661f-4f3b-88b6-cc687515ba76" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730739 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac10a03-94cf-460b-9d47-305d6ffaa16a" containerName="mariadb-database-create" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730752 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f469bfad-5e99-414b-b44c-18c5fde5a96b" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730770 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f78ec8-661f-4f3b-88b6-cc687515ba76" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.730783 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b83e6b50-dafa-4bb0-96d1-89ad756e947a" containerName="mariadb-account-create-update" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.731515 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.736783 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.737329 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.747472 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kpkmt"] Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.747854 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jkdj6" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.756330 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddld\" (UniqueName: \"kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.756408 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.756520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.756549 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.857728 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.857799 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.857869 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddld\" (UniqueName: \"kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.857924 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.863831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.864289 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.865783 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:26 crc kubenswrapper[5033]: I0319 19:17:26.877418 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddld\" (UniqueName: \"kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld\") pod \"nova-cell0-conductor-db-sync-kpkmt\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:27 crc kubenswrapper[5033]: I0319 19:17:27.097407 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:27 crc kubenswrapper[5033]: I0319 19:17:27.216879 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerStarted","Data":"7941b6a8ef5a5954da1cf9af48a835cfbe2bab512a6b35d07723496f41ee9793"} Mar 19 19:17:27 crc kubenswrapper[5033]: I0319 19:17:27.218142 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:17:27 crc kubenswrapper[5033]: I0319 19:17:27.244317 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.409052666 podStartE2EDuration="7.244299833s" podCreationTimestamp="2026-03-19 19:17:20 +0000 UTC" firstStartedPulling="2026-03-19 19:17:21.432092225 +0000 UTC m=+1251.537122074" lastFinishedPulling="2026-03-19 19:17:26.267339402 +0000 UTC m=+1256.372369241" observedRunningTime="2026-03-19 19:17:27.243749928 +0000 UTC m=+1257.348779777" watchObservedRunningTime="2026-03-19 19:17:27.244299833 +0000 UTC m=+1257.349329682" Mar 19 19:17:27 crc kubenswrapper[5033]: I0319 19:17:27.821367 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kpkmt"] Mar 19 19:17:28 crc kubenswrapper[5033]: I0319 19:17:28.229187 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" event={"ID":"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7","Type":"ContainerStarted","Data":"6dd7d386693acfc15a4a7233a4867a00823787b7b504cb547689e83ef16504f3"} Mar 19 19:17:39 crc kubenswrapper[5033]: I0319 19:17:39.335254 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" event={"ID":"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7","Type":"ContainerStarted","Data":"4f6d5113c10a077e188734d798b0443e95e057d5c99bd69653518cb220db3329"} Mar 19 19:17:40 crc kubenswrapper[5033]: 
I0319 19:17:40.758791 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:17:40 crc kubenswrapper[5033]: I0319 19:17:40.759106 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:17:49 crc kubenswrapper[5033]: I0319 19:17:49.430291 5033 generic.go:334] "Generic (PLEG): container finished" podID="ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" containerID="4f6d5113c10a077e188734d798b0443e95e057d5c99bd69653518cb220db3329" exitCode=0 Mar 19 19:17:49 crc kubenswrapper[5033]: I0319 19:17:49.430368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" event={"ID":"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7","Type":"ContainerDied","Data":"4f6d5113c10a077e188734d798b0443e95e057d5c99bd69653518cb220db3329"} Mar 19 19:17:50 crc kubenswrapper[5033]: I0319 19:17:50.831714 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.259695 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.394441 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddld\" (UniqueName: \"kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld\") pod \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.394537 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts\") pod \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.394572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data\") pod \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.394755 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle\") pod \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\" (UID: \"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7\") " Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.405592 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts" (OuterVolumeSpecName: "scripts") pod "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" (UID: "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.405747 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld" (OuterVolumeSpecName: "kube-api-access-4ddld") pod "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" (UID: "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7"). InnerVolumeSpecName "kube-api-access-4ddld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.426857 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data" (OuterVolumeSpecName: "config-data") pod "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" (UID: "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.443083 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" (UID: "ae335ad1-43b7-4f59-a78a-bfe88ed68cd7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.453910 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kpkmt" event={"ID":"ae335ad1-43b7-4f59-a78a-bfe88ed68cd7","Type":"ContainerDied","Data":"6dd7d386693acfc15a4a7233a4867a00823787b7b504cb547689e83ef16504f3"}
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.453966 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd7d386693acfc15a4a7233a4867a00823787b7b504cb547689e83ef16504f3"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.454019 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kpkmt"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.497581 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.497618 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddld\" (UniqueName: \"kubernetes.io/projected/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-kube-api-access-4ddld\") on node \"crc\" DevicePath \"\""
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.497632 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.497644 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.557869 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 19:17:51 crc kubenswrapper[5033]: E0319 19:17:51.558420 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" containerName="nova-cell0-conductor-db-sync"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.558463 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" containerName="nova-cell0-conductor-db-sync"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.558724 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" containerName="nova-cell0-conductor-db-sync"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.559769 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.562426 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.562743 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jkdj6"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.574900 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.705460 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.705544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.705661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthtq\" (UniqueName: \"kubernetes.io/projected/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-kube-api-access-tthtq\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.807068 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.807133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.807212 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthtq\" (UniqueName: \"kubernetes.io/projected/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-kube-api-access-tthtq\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.814235 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.820518 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.826043 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthtq\" (UniqueName: \"kubernetes.io/projected/9da9faaa-b095-42c9-90d9-d6b0dac1b3c9-kube-api-access-tthtq\") pod \"nova-cell0-conductor-0\" (UID: \"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:51 crc kubenswrapper[5033]: I0319 19:17:51.891372 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:52 crc kubenswrapper[5033]: I0319 19:17:52.410820 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 19:17:52 crc kubenswrapper[5033]: I0319 19:17:52.465669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9","Type":"ContainerStarted","Data":"55f3c7fbc318e7602bd651884f2c0381e2717c1a0093db278d01e07b181d5d7b"}
Mar 19 19:17:53 crc kubenswrapper[5033]: I0319 19:17:53.475011 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9da9faaa-b095-42c9-90d9-d6b0dac1b3c9","Type":"ContainerStarted","Data":"6d8b39b7ed3e5c98d61b0540a3a6d95453f53cdbf565c2a4838a186fa03b17dd"}
Mar 19 19:17:53 crc kubenswrapper[5033]: I0319 19:17:53.475404 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 19 19:17:55 crc kubenswrapper[5033]: I0319 19:17:55.654870 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=4.654854166 podStartE2EDuration="4.654854166s" podCreationTimestamp="2026-03-19 19:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:53.492101974 +0000 UTC m=+1283.597131823" watchObservedRunningTime="2026-03-19 19:17:55.654854166 +0000 UTC m=+1285.759884015"
Mar 19 19:17:55 crc kubenswrapper[5033]: I0319 19:17:55.664318 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:55 crc kubenswrapper[5033]: I0319 19:17:55.664514 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" containerName="kube-state-metrics" containerID="cri-o://8ccb34d2c903bbca2ba09f3303239e9f3a44fc474f9a19c5f0b8409a8ea5ffb4" gracePeriod=30
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.509037 5033 generic.go:334] "Generic (PLEG): container finished" podID="2baa96fb-8508-4335-b43e-4ec2da1af123" containerID="8ccb34d2c903bbca2ba09f3303239e9f3a44fc474f9a19c5f0b8409a8ea5ffb4" exitCode=2
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.509077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2baa96fb-8508-4335-b43e-4ec2da1af123","Type":"ContainerDied","Data":"8ccb34d2c903bbca2ba09f3303239e9f3a44fc474f9a19c5f0b8409a8ea5ffb4"}
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.509101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2baa96fb-8508-4335-b43e-4ec2da1af123","Type":"ContainerDied","Data":"4f40039fca189e3cad106d3f5411ce6a56494518e303cd3df586ea625b731893"}
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.509114 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f40039fca189e3cad106d3f5411ce6a56494518e303cd3df586ea625b731893"
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.550417 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.579965 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrkm\" (UniqueName: \"kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm\") pod \"2baa96fb-8508-4335-b43e-4ec2da1af123\" (UID: \"2baa96fb-8508-4335-b43e-4ec2da1af123\") "
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.589539 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm" (OuterVolumeSpecName: "kube-api-access-knrkm") pod "2baa96fb-8508-4335-b43e-4ec2da1af123" (UID: "2baa96fb-8508-4335-b43e-4ec2da1af123"). InnerVolumeSpecName "kube-api-access-knrkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:17:56 crc kubenswrapper[5033]: I0319 19:17:56.682718 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrkm\" (UniqueName: \"kubernetes.io/projected/2baa96fb-8508-4335-b43e-4ec2da1af123-kube-api-access-knrkm\") on node \"crc\" DevicePath \"\""
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.520160 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.546253 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.561384 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.593440 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:57 crc kubenswrapper[5033]: E0319 19:17:57.594476 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" containerName="kube-state-metrics"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.594505 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" containerName="kube-state-metrics"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.595037 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" containerName="kube-state-metrics"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.599574 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.604956 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.605602 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.625997 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.702808 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.702874 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l7zl\" (UniqueName: \"kubernetes.io/projected/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-api-access-7l7zl\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.702932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.703099 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.805191 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.805336 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.805406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.805435 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l7zl\" (UniqueName: \"kubernetes.io/projected/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-api-access-7l7zl\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.810156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.811949 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.819086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855aaf56-adc4-45b2-a632-afc1aeb26f79-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.830095 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l7zl\" (UniqueName: \"kubernetes.io/projected/855aaf56-adc4-45b2-a632-afc1aeb26f79-kube-api-access-7l7zl\") pod \"kube-state-metrics-0\" (UID: \"855aaf56-adc4-45b2-a632-afc1aeb26f79\") " pod="openstack/kube-state-metrics-0"
Mar 19 19:17:57 crc kubenswrapper[5033]: I0319 19:17:57.920074 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.090184 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.090470 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-central-agent" containerID="cri-o://11f12147f7ca93e7ddc7a2f10811187b0d98c8fe525eaf914a50914419d8ea56" gracePeriod=30
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.090584 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="proxy-httpd" containerID="cri-o://7941b6a8ef5a5954da1cf9af48a835cfbe2bab512a6b35d07723496f41ee9793" gracePeriod=30
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.090646 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="sg-core" containerID="cri-o://3473e4ce35a32bd0f3d16927c60c7ee9e0446057a4193d5e0dd9bf7d4ee40e18" gracePeriod=30
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.090685 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-notification-agent" containerID="cri-o://90188090d5f8db174a8b9fbf7f7cf5693044fe3c527d679527fc9092456b9a8d" gracePeriod=30
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.404209 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.531120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"855aaf56-adc4-45b2-a632-afc1aeb26f79","Type":"ContainerStarted","Data":"241467429ca0a99adc9f878202aeaa235f9a2f2f0fcdef44e46040799cd1f4fd"}
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534553 5033 generic.go:334] "Generic (PLEG): container finished" podID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerID="7941b6a8ef5a5954da1cf9af48a835cfbe2bab512a6b35d07723496f41ee9793" exitCode=0
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534577 5033 generic.go:334] "Generic (PLEG): container finished" podID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerID="3473e4ce35a32bd0f3d16927c60c7ee9e0446057a4193d5e0dd9bf7d4ee40e18" exitCode=2
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534604 5033 generic.go:334] "Generic (PLEG): container finished" podID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerID="11f12147f7ca93e7ddc7a2f10811187b0d98c8fe525eaf914a50914419d8ea56" exitCode=0
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534662 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerDied","Data":"7941b6a8ef5a5954da1cf9af48a835cfbe2bab512a6b35d07723496f41ee9793"}
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534734 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerDied","Data":"3473e4ce35a32bd0f3d16927c60c7ee9e0446057a4193d5e0dd9bf7d4ee40e18"}
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.534749 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerDied","Data":"11f12147f7ca93e7ddc7a2f10811187b0d98c8fe525eaf914a50914419d8ea56"}
Mar 19 19:17:58 crc kubenswrapper[5033]: I0319 19:17:58.630822 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2baa96fb-8508-4335-b43e-4ec2da1af123" path="/var/lib/kubelet/pods/2baa96fb-8508-4335-b43e-4ec2da1af123/volumes"
Mar 19 19:17:59 crc kubenswrapper[5033]: I0319 19:17:59.544084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"855aaf56-adc4-45b2-a632-afc1aeb26f79","Type":"ContainerStarted","Data":"e54ee49dc6d33b2b6c3824481ec779d23e0f20fca8e688205e72888510cc597d"}
Mar 19 19:17:59 crc kubenswrapper[5033]: I0319 19:17:59.544475 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 19 19:17:59 crc kubenswrapper[5033]: I0319 19:17:59.567235 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.114909192 podStartE2EDuration="2.567216377s" podCreationTimestamp="2026-03-19 19:17:57 +0000 UTC" firstStartedPulling="2026-03-19 19:17:58.424910843 +0000 UTC m=+1288.529940682" lastFinishedPulling="2026-03-19 19:17:58.877218008 +0000 UTC m=+1288.982247867" observedRunningTime="2026-03-19 19:17:59.559206191 +0000 UTC m=+1289.664236030" watchObservedRunningTime="2026-03-19 19:17:59.567216377 +0000 UTC m=+1289.672246226"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.132307 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565798-7lv97"]
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.133862 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-7lv97"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.137038 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.137471 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.143127 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.144718 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-7lv97"]
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.182297 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqk9\" (UniqueName: \"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9\") pod \"auto-csr-approver-29565798-7lv97\" (UID: \"96e45587-5c12-422b-850f-782805c2169a\") " pod="openshift-infra/auto-csr-approver-29565798-7lv97"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.284539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqk9\" (UniqueName: \"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9\") pod \"auto-csr-approver-29565798-7lv97\" (UID: \"96e45587-5c12-422b-850f-782805c2169a\") " pod="openshift-infra/auto-csr-approver-29565798-7lv97"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.301422 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqk9\" (UniqueName: \"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9\") pod \"auto-csr-approver-29565798-7lv97\" (UID: \"96e45587-5c12-422b-850f-782805c2169a\") " pod="openshift-infra/auto-csr-approver-29565798-7lv97"
Mar 19 19:18:00 crc kubenswrapper[5033]: I0319 19:18:00.451439 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-7lv97"
Mar 19 19:18:01 crc kubenswrapper[5033]: I0319 19:18:01.046998 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-7lv97"]
Mar 19 19:18:01 crc kubenswrapper[5033]: I0319 19:18:01.565693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-7lv97" event={"ID":"96e45587-5c12-422b-850f-782805c2169a","Type":"ContainerStarted","Data":"dfdc77e9eff73aff98426fa14ea918d86255eebb84752825450705505b04e145"}
Mar 19 19:18:01 crc kubenswrapper[5033]: I0319 19:18:01.568787 5033 generic.go:334] "Generic (PLEG): container finished" podID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerID="90188090d5f8db174a8b9fbf7f7cf5693044fe3c527d679527fc9092456b9a8d" exitCode=0
Mar 19 19:18:01 crc kubenswrapper[5033]: I0319 19:18:01.568832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerDied","Data":"90188090d5f8db174a8b9fbf7f7cf5693044fe3c527d679527fc9092456b9a8d"}
Mar 19 19:18:01 crc kubenswrapper[5033]: I0319 19:18:01.927108 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.151003 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236161 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236247 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236310 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236330 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjfh\" (UniqueName: \"kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236460 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236494 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.236597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml\") pod \"80fd634e-a900-449a-9ff8-0c8dc555e36c\" (UID: \"80fd634e-a900-449a-9ff8-0c8dc555e36c\") "
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.237678 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.241405 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.245600 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh" (OuterVolumeSpecName: "kube-api-access-dqjfh") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "kube-api-access-dqjfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.250599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts" (OuterVolumeSpecName: "scripts") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.294943 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.340160 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.340193 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqjfh\" (UniqueName: \"kubernetes.io/projected/80fd634e-a900-449a-9ff8-0c8dc555e36c-kube-api-access-dqjfh\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.340203 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.340213 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80fd634e-a900-449a-9ff8-0c8dc555e36c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.340222 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.414823 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.417663 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data" (OuterVolumeSpecName: "config-data") pod "80fd634e-a900-449a-9ff8-0c8dc555e36c" (UID: "80fd634e-a900-449a-9ff8-0c8dc555e36c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.442683 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.442717 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fd634e-a900-449a-9ff8-0c8dc555e36c-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.482672 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wfb9t"]
Mar 19 19:18:02 crc kubenswrapper[5033]: E0319 19:18:02.483184 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-central-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483201 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-central-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: E0319 19:18:02.483217 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="proxy-httpd"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483224 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="proxy-httpd"
Mar 19 19:18:02 crc kubenswrapper[5033]: E0319 19:18:02.483243 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-notification-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483251 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-notification-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: E0319 19:18:02.483266 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="sg-core"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483272 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="sg-core"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483445 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-notification-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483477 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="proxy-httpd"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483485 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="sg-core"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.483500 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" containerName="ceilometer-central-agent"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.484179 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfb9t"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.487077 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.487291 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.491325 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfb9t"]
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.544682 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.544734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8jcd\" (UniqueName: \"kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.544843 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t"
Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.544899 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.589327 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-7lv97" event={"ID":"96e45587-5c12-422b-850f-782805c2169a","Type":"ContainerStarted","Data":"a82629c2564dd1ad6bd44e6f1be1bfd8f75714bca8c9c549103ad5a2cc961820"} Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.595630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80fd634e-a900-449a-9ff8-0c8dc555e36c","Type":"ContainerDied","Data":"3319c2129197e19b0dee46743c437a9e5cb198194b556597d48de874ca7985dd"} Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.595685 5033 scope.go:117] "RemoveContainer" containerID="7941b6a8ef5a5954da1cf9af48a835cfbe2bab512a6b35d07723496f41ee9793" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.595701 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.628549 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565798-7lv97" podStartSLOduration=1.808334758 podStartE2EDuration="2.628449126s" podCreationTimestamp="2026-03-19 19:18:00 +0000 UTC" firstStartedPulling="2026-03-19 19:18:01.025669346 +0000 UTC m=+1291.130699185" lastFinishedPulling="2026-03-19 19:18:01.845783704 +0000 UTC m=+1291.950813553" observedRunningTime="2026-03-19 19:18:02.609135621 +0000 UTC m=+1292.714165460" watchObservedRunningTime="2026-03-19 19:18:02.628449126 +0000 UTC m=+1292.733478975" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.639202 5033 scope.go:117] "RemoveContainer" containerID="3473e4ce35a32bd0f3d16927c60c7ee9e0446057a4193d5e0dd9bf7d4ee40e18" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.646664 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.646739 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.646786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " 
pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.646809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8jcd\" (UniqueName: \"kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.664272 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.665177 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.666634 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.668122 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.670337 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.673653 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.674698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8jcd\" (UniqueName: \"kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd\") pod \"nova-cell0-cell-mapping-wfb9t\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.691263 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.719682 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.721811 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.734279 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.751979 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.759589 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.759639 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvnx5\" (UniqueName: \"kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.759759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.777762 5033 scope.go:117] "RemoveContainer" containerID="90188090d5f8db174a8b9fbf7f7cf5693044fe3c527d679527fc9092456b9a8d" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.788110 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.831946 5033 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.834769 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.843242 5033 scope.go:117] "RemoveContainer" containerID="11f12147f7ca93e7ddc7a2f10811187b0d98c8fe525eaf914a50914419d8ea56" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.865205 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.873760 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.873897 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.874183 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.874309 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqzm\" (UniqueName: \"kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.874601 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.874692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvnx5\" (UniqueName: \"kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.907257 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.908214 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.908627 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvnx5\" 
(UniqueName: \"kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.928901 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.935887 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.941437 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.968250 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978453 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqzm\" (UniqueName: \"kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " 
pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwtm\" (UniqueName: \"kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978695 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.978785 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.980352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.987765 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:02 
crc kubenswrapper[5033]: I0319 19:18:02.988517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.990966 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.995451 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.995717 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:18:02 crc kubenswrapper[5033]: I0319 19:18:02.995908 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.003356 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.004026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqzm\" (UniqueName: \"kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm\") pod \"nova-metadata-0\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " pod="openstack/nova-metadata-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.052215 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.077659 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.079646 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.082888 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.082957 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083015 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083076 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csx5d\" (UniqueName: \"kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d\") pod 
\"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083127 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083147 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwtm\" (UniqueName: \"kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.083225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: 
I0319 19:18:03.083248 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.092665 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.094734 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.110070 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.110851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.112263 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxwtm\" (UniqueName: \"kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm\") pod \"nova-scheduler-0\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.112509 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.119090 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.136979 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.157199 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.172623 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185704 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkrr\" (UniqueName: \"kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185766 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9v72\" (UniqueName: \"kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185831 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185857 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185880 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185895 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.185961 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csx5d\" (UniqueName: 
\"kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186155 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186174 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186189 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186232 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: 
\"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186255 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186276 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.186318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.188045 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.191598 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.192502 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.194372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.195112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.195898 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.200078 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc 
kubenswrapper[5033]: I0319 19:18:03.216346 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csx5d\" (UniqueName: \"kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d\") pod \"ceilometer-0\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289746 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289827 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289939 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.289982 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.290006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.290029 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.290056 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkrr\" (UniqueName: \"kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.290087 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9v72\" (UniqueName: \"kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72\") pod 
\"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.291166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.293063 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.293577 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.297641 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.298805 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.298961 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.306374 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.307934 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.308371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.312245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9v72\" (UniqueName: \"kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72\") pod \"dnsmasq-dns-78cd565959-8rq6r\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.320987 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pfkrr\" (UniqueName: \"kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr\") pod \"nova-api-0\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.336990 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.400554 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.455030 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.636303 5033 generic.go:334] "Generic (PLEG): container finished" podID="96e45587-5c12-422b-850f-782805c2169a" containerID="a82629c2564dd1ad6bd44e6f1be1bfd8f75714bca8c9c549103ad5a2cc961820" exitCode=0 Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.636578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-7lv97" event={"ID":"96e45587-5c12-422b-850f-782805c2169a","Type":"ContainerDied","Data":"a82629c2564dd1ad6bd44e6f1be1bfd8f75714bca8c9c549103ad5a2cc961820"} Mar 19 19:18:03 crc kubenswrapper[5033]: I0319 19:18:03.645058 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfb9t"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.153047 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.436526 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.440642 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 
19:18:04.515579 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-77vvw"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.518177 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.523186 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.523397 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.537030 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-77vvw"] Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.637831 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80fd634e-a900-449a-9ff8-0c8dc555e36c" path="/var/lib/kubelet/pods/80fd634e-a900-449a-9ff8-0c8dc555e36c/volumes" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.651237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfb9t" event={"ID":"97baf414-f658-4c71-83a5-a0ecd6d09e90","Type":"ContainerStarted","Data":"cdbe1c534ce22b52c8228d973137f9955dc6f9bbf447ba7ad1411b7d356bfccf"} Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.651279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfb9t" event={"ID":"97baf414-f658-4c71-83a5-a0ecd6d09e90","Type":"ContainerStarted","Data":"9d46a88665ca4d1065501e927bc54febb7640e233f30344075e2d20045eb6a0e"} Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.653364 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"8e57df54-2285-4657-b3aa-4cdaf6a81269","Type":"ContainerStarted","Data":"96795bfdd6eaae133315d57d11886328cedd930e9bc3c17c24d95c32599fcc33"} Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.654528 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38730c07-3023-4319-a997-da522874a9aa","Type":"ContainerStarted","Data":"9f5ca3f23bd3e80537b871e2b0af5b077e5fb10cb9fa7d2868d1cd8a4736b94c"} Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.655751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerStarted","Data":"b629cff1fd3714e56e93f5e60ce16264465a1d0f8c58801ec8b9367419760d0b"} Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.663369 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.666860 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpz22\" (UniqueName: \"kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.667974 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " 
pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.671349 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.674550 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wfb9t" podStartSLOduration=2.674528757 podStartE2EDuration="2.674528757s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:04.670266297 +0000 UTC m=+1294.775296146" watchObservedRunningTime="2026-03-19 19:18:04.674528757 +0000 UTC m=+1294.779558606" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.773881 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.774270 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.774401 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vpz22\" (UniqueName: \"kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.774506 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.782752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.784209 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.796904 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.799308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpz22\" (UniqueName: 
\"kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22\") pod \"nova-cell1-conductor-db-sync-77vvw\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:04 crc kubenswrapper[5033]: I0319 19:18:04.856938 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.205074 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.221516 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.246490 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.605998 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-77vvw"] Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.745719 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-77vvw" event={"ID":"3bdced5f-fb99-4010-b386-f944a01767df","Type":"ContainerStarted","Data":"76154d559b388f536ccf78588acac363362ec0f475c7beb3ac763fbb8687f3de"} Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.755772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" event={"ID":"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35","Type":"ContainerStarted","Data":"984581f8fb999b1d5dfd3809202be1d3e6a3c7efcde1529186bc65d479543cc1"} Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.773023 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-7lv97" Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.783160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerStarted","Data":"ca5fdd4ee473a565342cd736552070e1f9efed135dc253399698d28e4e5392cd"} Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.794902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerStarted","Data":"78d873399f5cad589e7912ee3384bd6565a2250335a1b5db3999de195ec25c88"} Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.801603 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-7lv97" Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.801764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-7lv97" event={"ID":"96e45587-5c12-422b-850f-782805c2169a","Type":"ContainerDied","Data":"dfdc77e9eff73aff98426fa14ea918d86255eebb84752825450705505b04e145"} Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.801792 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfdc77e9eff73aff98426fa14ea918d86255eebb84752825450705505b04e145" Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.941911 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxqk9\" (UniqueName: \"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9\") pod \"96e45587-5c12-422b-850f-782805c2169a\" (UID: \"96e45587-5c12-422b-850f-782805c2169a\") " Mar 19 19:18:05 crc kubenswrapper[5033]: I0319 19:18:05.951421 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9" (OuterVolumeSpecName: "kube-api-access-qxqk9") pod "96e45587-5c12-422b-850f-782805c2169a" (UID: "96e45587-5c12-422b-850f-782805c2169a"). InnerVolumeSpecName "kube-api-access-qxqk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.046184 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxqk9\" (UniqueName: \"kubernetes.io/projected/96e45587-5c12-422b-850f-782805c2169a-kube-api-access-qxqk9\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.492942 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.508639 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.827361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-77vvw" event={"ID":"3bdced5f-fb99-4010-b386-f944a01767df","Type":"ContainerStarted","Data":"422717fffafbccb1ed80dd1b95be440f70bd0c84c74665fdee32247cc6b72f45"} Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.832315 5033 generic.go:334] "Generic (PLEG): container finished" podID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerID="0b083343ad67b6a25ee23998bfd6a2acda3d55515c740d00efe84ae16f3c283d" exitCode=0 Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.832409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" event={"ID":"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35","Type":"ContainerDied","Data":"0b083343ad67b6a25ee23998bfd6a2acda3d55515c740d00efe84ae16f3c283d"} Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.843427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerStarted","Data":"69777b8be862a94478d21561352106d1a8b69660b3039131b7ac6563f9eaf850"} Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.869679 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-77vvw" podStartSLOduration=2.869659471 podStartE2EDuration="2.869659471s" podCreationTimestamp="2026-03-19 19:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:06.862833598 +0000 UTC m=+1296.967863447" watchObservedRunningTime="2026-03-19 19:18:06.869659471 +0000 UTC m=+1296.974689320" Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.892487 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-b6mbz"] Mar 19 19:18:06 crc kubenswrapper[5033]: I0319 19:18:06.907063 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-b6mbz"] Mar 19 19:18:07 crc kubenswrapper[5033]: I0319 19:18:07.942311 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 19:18:08 crc kubenswrapper[5033]: I0319 19:18:08.638870 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017b2cf3-89c3-438d-a4f2-cc11221cc49a" path="/var/lib/kubelet/pods/017b2cf3-89c3-438d-a4f2-cc11221cc49a/volumes" Mar 19 19:18:09 crc kubenswrapper[5033]: I0319 19:18:09.896368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" event={"ID":"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35","Type":"ContainerStarted","Data":"3daaeb15559cdd14ae0a13a1ac3b4f2f176eb533ec87731bf198329561e9fd76"} Mar 19 19:18:09 crc kubenswrapper[5033]: I0319 19:18:09.896998 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 
19:18:09 crc kubenswrapper[5033]: I0319 19:18:09.925122 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" podStartSLOduration=7.925091351 podStartE2EDuration="7.925091351s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:09.913847372 +0000 UTC m=+1300.018877221" watchObservedRunningTime="2026-03-19 19:18:09.925091351 +0000 UTC m=+1300.030121200" Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.758279 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.758604 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.925090 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerStarted","Data":"c0f423fd215ccfa1efe693e2b9d558acb629e3fe23d7f385319157763f75b801"} Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.928042 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerStarted","Data":"b5c24e6516126f026933d2df5fa1c622d696f28caae1dcf12d07b6103ab1598f"} Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.933148 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e57df54-2285-4657-b3aa-4cdaf6a81269","Type":"ContainerStarted","Data":"890b5f8499ee8ffd2c17b58c5564ce2517c835ed6dab6364dc66b90379783541"} Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.933304 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8e57df54-2285-4657-b3aa-4cdaf6a81269" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://890b5f8499ee8ffd2c17b58c5564ce2517c835ed6dab6364dc66b90379783541" gracePeriod=30 Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.952766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38730c07-3023-4319-a997-da522874a9aa","Type":"ContainerStarted","Data":"714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4"} Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.953852 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.480865603 podStartE2EDuration="8.953840631s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="2026-03-19 19:18:04.147210795 +0000 UTC m=+1294.252240644" lastFinishedPulling="2026-03-19 19:18:09.620185823 +0000 UTC m=+1299.725215672" observedRunningTime="2026-03-19 19:18:10.946584426 +0000 UTC m=+1301.051614275" watchObservedRunningTime="2026-03-19 19:18:10.953840631 +0000 UTC m=+1301.058870480" Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.975794 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-log" containerID="cri-o://4dc22f34403355d654d452b51f8e00f538ce9efd39f45c0befc4e22534640658" gracePeriod=30 Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.976276 5033 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-metadata-0" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-metadata" containerID="cri-o://ed85073fa542ed90de50fb07d70770740f6af11c7ea66d6b870c30ee7eec070f" gracePeriod=30 Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.976283 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerStarted","Data":"ed85073fa542ed90de50fb07d70770740f6af11c7ea66d6b870c30ee7eec070f"} Mar 19 19:18:10 crc kubenswrapper[5033]: I0319 19:18:10.976324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerStarted","Data":"4dc22f34403355d654d452b51f8e00f538ce9efd39f45c0befc4e22534640658"} Mar 19 19:18:11 crc kubenswrapper[5033]: I0319 19:18:11.020928 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.85619061 podStartE2EDuration="9.020911014s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="2026-03-19 19:18:04.45867725 +0000 UTC m=+1294.563707099" lastFinishedPulling="2026-03-19 19:18:09.623397654 +0000 UTC m=+1299.728427503" observedRunningTime="2026-03-19 19:18:10.984250134 +0000 UTC m=+1301.089279983" watchObservedRunningTime="2026-03-19 19:18:11.020911014 +0000 UTC m=+1301.125940853" Mar 19 19:18:11 crc kubenswrapper[5033]: I0319 19:18:11.987654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerStarted","Data":"70f25493019bf89024c0cb6a9e2a1db1914ee810b49269043aa7bc48c5ff36ce"} Mar 19 19:18:11 crc kubenswrapper[5033]: I0319 19:18:11.989980 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerStarted","Data":"01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880"} Mar 19 19:18:11 crc kubenswrapper[5033]: I0319 19:18:11.993469 5033 generic.go:334] "Generic (PLEG): container finished" podID="7988b549-babc-47eb-903d-4018cae11462" containerID="4dc22f34403355d654d452b51f8e00f538ce9efd39f45c0befc4e22534640658" exitCode=143 Mar 19 19:18:11 crc kubenswrapper[5033]: I0319 19:18:11.993612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerDied","Data":"4dc22f34403355d654d452b51f8e00f538ce9efd39f45c0befc4e22534640658"} Mar 19 19:18:12 crc kubenswrapper[5033]: I0319 19:18:12.015369 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.6493456680000005 podStartE2EDuration="10.0153492s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="2026-03-19 19:18:05.275115614 +0000 UTC m=+1295.380145463" lastFinishedPulling="2026-03-19 19:18:09.641119146 +0000 UTC m=+1299.746148995" observedRunningTime="2026-03-19 19:18:12.011948483 +0000 UTC m=+1302.116978332" watchObservedRunningTime="2026-03-19 19:18:12.0153492 +0000 UTC m=+1302.120379049" Mar 19 19:18:12 crc kubenswrapper[5033]: I0319 19:18:12.015633 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.841578398 podStartE2EDuration="10.015626618s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="2026-03-19 19:18:04.46825587 +0000 UTC m=+1294.573285719" lastFinishedPulling="2026-03-19 19:18:09.64230409 +0000 UTC m=+1299.747333939" observedRunningTime="2026-03-19 19:18:11.04476851 +0000 UTC m=+1301.149798359" watchObservedRunningTime="2026-03-19 19:18:12.015626618 +0000 UTC m=+1302.120656467" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.093552 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.301317 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.301652 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.360097 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.456640 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:18:13 crc kubenswrapper[5033]: I0319 19:18:13.456708 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:18:14 crc kubenswrapper[5033]: I0319 19:18:14.017399 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerStarted","Data":"238f55894c528ed48b9b04164f700458dc6ce6e32a8a80d1551a355e68e08e51"} Mar 19 19:18:14 crc kubenswrapper[5033]: I0319 19:18:14.069880 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:18:14 crc kubenswrapper[5033]: I0319 19:18:14.101614 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.097325575 podStartE2EDuration="12.101586515s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="2026-03-19 19:18:05.275022481 +0000 UTC m=+1295.380052330" lastFinishedPulling="2026-03-19 19:18:13.279283421 +0000 UTC m=+1303.384313270" observedRunningTime="2026-03-19 19:18:14.052597156 +0000 UTC m=+1304.157627025" watchObservedRunningTime="2026-03-19 19:18:14.101586515 +0000 
UTC m=+1304.206616384" Mar 19 19:18:14 crc kubenswrapper[5033]: I0319 19:18:14.538721 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.226:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:14 crc kubenswrapper[5033]: I0319 19:18:14.538761 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.226:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:15 crc kubenswrapper[5033]: I0319 19:18:15.030958 5033 generic.go:334] "Generic (PLEG): container finished" podID="97baf414-f658-4c71-83a5-a0ecd6d09e90" containerID="cdbe1c534ce22b52c8228d973137f9955dc6f9bbf447ba7ad1411b7d356bfccf" exitCode=0 Mar 19 19:18:15 crc kubenswrapper[5033]: I0319 19:18:15.031050 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfb9t" event={"ID":"97baf414-f658-4c71-83a5-a0ecd6d09e90","Type":"ContainerDied","Data":"cdbe1c534ce22b52c8228d973137f9955dc6f9bbf447ba7ad1411b7d356bfccf"} Mar 19 19:18:15 crc kubenswrapper[5033]: I0319 19:18:15.033412 5033 generic.go:334] "Generic (PLEG): container finished" podID="3bdced5f-fb99-4010-b386-f944a01767df" containerID="422717fffafbccb1ed80dd1b95be440f70bd0c84c74665fdee32247cc6b72f45" exitCode=0 Mar 19 19:18:15 crc kubenswrapper[5033]: I0319 19:18:15.033476 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-77vvw" event={"ID":"3bdced5f-fb99-4010-b386-f944a01767df","Type":"ContainerDied","Data":"422717fffafbccb1ed80dd1b95be440f70bd0c84c74665fdee32247cc6b72f45"} Mar 19 19:18:15 crc kubenswrapper[5033]: I0319 19:18:15.033904 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.068639 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.068731 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-77vvw" event={"ID":"3bdced5f-fb99-4010-b386-f944a01767df","Type":"ContainerDied","Data":"76154d559b388f536ccf78588acac363362ec0f475c7beb3ac763fbb8687f3de"} Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.069144 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76154d559b388f536ccf78588acac363362ec0f475c7beb3ac763fbb8687f3de" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.071997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wfb9t" event={"ID":"97baf414-f658-4c71-83a5-a0ecd6d09e90","Type":"ContainerDied","Data":"9d46a88665ca4d1065501e927bc54febb7640e233f30344075e2d20045eb6a0e"} Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.072025 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d46a88665ca4d1065501e927bc54febb7640e233f30344075e2d20045eb6a0e" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.075756 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.176048 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts\") pod \"97baf414-f658-4c71-83a5-a0ecd6d09e90\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.176147 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data\") pod \"3bdced5f-fb99-4010-b386-f944a01767df\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.176239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpz22\" (UniqueName: \"kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22\") pod \"3bdced5f-fb99-4010-b386-f944a01767df\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.176308 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") pod \"97baf414-f658-4c71-83a5-a0ecd6d09e90\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.176968 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle\") pod \"3bdced5f-fb99-4010-b386-f944a01767df\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.177014 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts\") pod \"3bdced5f-fb99-4010-b386-f944a01767df\" (UID: \"3bdced5f-fb99-4010-b386-f944a01767df\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.177098 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8jcd\" (UniqueName: \"kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd\") pod \"97baf414-f658-4c71-83a5-a0ecd6d09e90\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.177157 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle\") pod \"97baf414-f658-4c71-83a5-a0ecd6d09e90\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.184530 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts" (OuterVolumeSpecName: "scripts") pod "3bdced5f-fb99-4010-b386-f944a01767df" (UID: "3bdced5f-fb99-4010-b386-f944a01767df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.186859 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd" (OuterVolumeSpecName: "kube-api-access-l8jcd") pod "97baf414-f658-4c71-83a5-a0ecd6d09e90" (UID: "97baf414-f658-4c71-83a5-a0ecd6d09e90"). InnerVolumeSpecName "kube-api-access-l8jcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.187004 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts" (OuterVolumeSpecName: "scripts") pod "97baf414-f658-4c71-83a5-a0ecd6d09e90" (UID: "97baf414-f658-4c71-83a5-a0ecd6d09e90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.188850 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22" (OuterVolumeSpecName: "kube-api-access-vpz22") pod "3bdced5f-fb99-4010-b386-f944a01767df" (UID: "3bdced5f-fb99-4010-b386-f944a01767df"). InnerVolumeSpecName "kube-api-access-vpz22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.228562 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97baf414-f658-4c71-83a5-a0ecd6d09e90" (UID: "97baf414-f658-4c71-83a5-a0ecd6d09e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.237700 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data" (OuterVolumeSpecName: "config-data") pod "3bdced5f-fb99-4010-b386-f944a01767df" (UID: "3bdced5f-fb99-4010-b386-f944a01767df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.263787 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bdced5f-fb99-4010-b386-f944a01767df" (UID: "3bdced5f-fb99-4010-b386-f944a01767df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.278639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data" (OuterVolumeSpecName: "config-data") pod "97baf414-f658-4c71-83a5-a0ecd6d09e90" (UID: "97baf414-f658-4c71-83a5-a0ecd6d09e90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.278965 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") pod \"97baf414-f658-4c71-83a5-a0ecd6d09e90\" (UID: \"97baf414-f658-4c71-83a5-a0ecd6d09e90\") " Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279445 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279472 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279481 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpz22\" (UniqueName: 
\"kubernetes.io/projected/3bdced5f-fb99-4010-b386-f944a01767df-kube-api-access-vpz22\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279491 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279499 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bdced5f-fb99-4010-b386-f944a01767df-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279515 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8jcd\" (UniqueName: \"kubernetes.io/projected/97baf414-f658-4c71-83a5-a0ecd6d09e90-kube-api-access-l8jcd\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279523 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:17 crc kubenswrapper[5033]: W0319 19:18:17.279605 5033 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/97baf414-f658-4c71-83a5-a0ecd6d09e90/volumes/kubernetes.io~secret/config-data Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.279618 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data" (OuterVolumeSpecName: "config-data") pod "97baf414-f658-4c71-83a5-a0ecd6d09e90" (UID: "97baf414-f658-4c71-83a5-a0ecd6d09e90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:17 crc kubenswrapper[5033]: I0319 19:18:17.381777 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97baf414-f658-4c71-83a5-a0ecd6d09e90-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.082530 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-77vvw" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.108762 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wfb9t" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192037 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.192549 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97baf414-f658-4c71-83a5-a0ecd6d09e90" containerName="nova-manage" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192568 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="97baf414-f658-4c71-83a5-a0ecd6d09e90" containerName="nova-manage" Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.192606 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e45587-5c12-422b-850f-782805c2169a" containerName="oc" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192614 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e45587-5c12-422b-850f-782805c2169a" containerName="oc" Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.192627 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdced5f-fb99-4010-b386-f944a01767df" containerName="nova-cell1-conductor-db-sync" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192633 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdced5f-fb99-4010-b386-f944a01767df" 
containerName="nova-cell1-conductor-db-sync" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192842 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e45587-5c12-422b-850f-782805c2169a" containerName="oc" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192867 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdced5f-fb99-4010-b386-f944a01767df" containerName="nova-cell1-conductor-db-sync" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.192888 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97baf414-f658-4c71-83a5-a0ecd6d09e90" containerName="nova-manage" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.193679 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.197490 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.200118 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xhpm\" (UniqueName: \"kubernetes.io/projected/ec25f362-50d7-4ebe-951c-c51868e2485d-kube-api-access-8xhpm\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.200168 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.200201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.204199 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.277519 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.277792 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-log" containerID="cri-o://b5c24e6516126f026933d2df5fa1c622d696f28caae1dcf12d07b6103ab1598f" gracePeriod=30 Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.278284 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-api" containerID="cri-o://01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880" gracePeriod=30 Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.287254 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.287509 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" containerID="cri-o://714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" gracePeriod=30 Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.303532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xhpm\" (UniqueName: \"kubernetes.io/projected/ec25f362-50d7-4ebe-951c-c51868e2485d-kube-api-access-8xhpm\") pod 
\"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.303597 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.303633 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.307307 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.313028 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.316868 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.317352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec25f362-50d7-4ebe-951c-c51868e2485d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.320711 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xhpm\" (UniqueName: \"kubernetes.io/projected/ec25f362-50d7-4ebe-951c-c51868e2485d-kube-api-access-8xhpm\") pod \"nova-cell1-conductor-0\" (UID: \"ec25f362-50d7-4ebe-951c-c51868e2485d\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.332101 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:18 crc kubenswrapper[5033]: E0319 19:18:18.332184 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.402599 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.497546 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.498300 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="dnsmasq-dns" containerID="cri-o://cdc2e359cb0222d0101827b6ba4c1c7166da08d5a4193628d49cb624e639a883" gracePeriod=10 Mar 19 19:18:18 crc kubenswrapper[5033]: I0319 19:18:18.530596 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.126036 5033 generic.go:334] "Generic (PLEG): container finished" podID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerID="b5c24e6516126f026933d2df5fa1c622d696f28caae1dcf12d07b6103ab1598f" exitCode=143 Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.126343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerDied","Data":"b5c24e6516126f026933d2df5fa1c622d696f28caae1dcf12d07b6103ab1598f"} Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.160761 5033 generic.go:334] "Generic (PLEG): container finished" podID="75101984-de53-4709-8870-e919c64bfd54" containerID="cdc2e359cb0222d0101827b6ba4c1c7166da08d5a4193628d49cb624e639a883" exitCode=0 Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.160820 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" event={"ID":"75101984-de53-4709-8870-e919c64bfd54","Type":"ContainerDied","Data":"cdc2e359cb0222d0101827b6ba4c1c7166da08d5a4193628d49cb624e639a883"} Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.283747 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.680117 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgvnt\" (UniqueName: \"kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754500 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754626 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.754729 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config\") pod \"75101984-de53-4709-8870-e919c64bfd54\" (UID: \"75101984-de53-4709-8870-e919c64bfd54\") " Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.792876 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt" (OuterVolumeSpecName: "kube-api-access-bgvnt") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "kube-api-access-bgvnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.826148 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config" (OuterVolumeSpecName: "config") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.843088 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.854224 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.854949 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.857348 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.857376 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.857386 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgvnt\" (UniqueName: \"kubernetes.io/projected/75101984-de53-4709-8870-e919c64bfd54-kube-api-access-bgvnt\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.857394 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.857403 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.879772 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75101984-de53-4709-8870-e919c64bfd54" (UID: "75101984-de53-4709-8870-e919c64bfd54"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:19 crc kubenswrapper[5033]: I0319 19:18:19.959803 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75101984-de53-4709-8870-e919c64bfd54-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.172693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec25f362-50d7-4ebe-951c-c51868e2485d","Type":"ContainerStarted","Data":"42d3b5161a66fc6e6da63537340c0ffb933fd7af120c1c82b587d266e3db332b"} Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.172751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec25f362-50d7-4ebe-951c-c51868e2485d","Type":"ContainerStarted","Data":"9455cbd7f567b81086b9505dc35668023bbf488b5df4dab989eb383489506366"} Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.172853 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.175332 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" event={"ID":"75101984-de53-4709-8870-e919c64bfd54","Type":"ContainerDied","Data":"ef9cfde990829b38f7f10334fb10f29a9e5971543ac62030e1a2d4154e57ce9e"} Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.175393 5033 scope.go:117] "RemoveContainer" containerID="cdc2e359cb0222d0101827b6ba4c1c7166da08d5a4193628d49cb624e639a883" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.175397 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-b8sch" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.199443 5033 scope.go:117] "RemoveContainer" containerID="0f15bbdf214a5e7bb690de6c79b53a54ccda584ebce1f9f98545692503ca02bd" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.202976 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.202958076 podStartE2EDuration="2.202958076s" podCreationTimestamp="2026-03-19 19:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:20.188767884 +0000 UTC m=+1310.293797733" watchObservedRunningTime="2026-03-19 19:18:20.202958076 +0000 UTC m=+1310.307987925" Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.221513 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.240934 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-b8sch"] Mar 19 19:18:20 crc kubenswrapper[5033]: I0319 19:18:20.636188 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75101984-de53-4709-8870-e919c64bfd54" path="/var/lib/kubelet/pods/75101984-de53-4709-8870-e919c64bfd54/volumes" Mar 19 19:18:21 crc kubenswrapper[5033]: I0319 19:18:21.138753 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:18:21 crc kubenswrapper[5033]: I0319 19:18:21.138809 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:18:21 crc kubenswrapper[5033]: I0319 19:18:21.458702 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:18:21 crc kubenswrapper[5033]: I0319 19:18:21.458742 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-api-0" Mar 19 19:18:21 crc kubenswrapper[5033]: E0319 19:18:21.709669 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod411fdb53_fe43_4b10_bbf5_099eff379a30.slice/crio-01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod411fdb53_fe43_4b10_bbf5_099eff379a30.slice/crio-conmon-01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.195913 5033 generic.go:334] "Generic (PLEG): container finished" podID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerID="01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880" exitCode=0 Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.195953 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerDied","Data":"01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880"} Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.195977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411fdb53-fe43-4b10-bbf5-099eff379a30","Type":"ContainerDied","Data":"78d873399f5cad589e7912ee3384bd6565a2250335a1b5db3999de195ec25c88"} Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.195989 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d873399f5cad589e7912ee3384bd6565a2250335a1b5db3999de195ec25c88" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.270130 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.416058 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfkrr\" (UniqueName: \"kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr\") pod \"411fdb53-fe43-4b10-bbf5-099eff379a30\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.416127 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs\") pod \"411fdb53-fe43-4b10-bbf5-099eff379a30\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.416208 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle\") pod \"411fdb53-fe43-4b10-bbf5-099eff379a30\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.416257 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data\") pod \"411fdb53-fe43-4b10-bbf5-099eff379a30\" (UID: \"411fdb53-fe43-4b10-bbf5-099eff379a30\") " Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.416782 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs" (OuterVolumeSpecName: "logs") pod "411fdb53-fe43-4b10-bbf5-099eff379a30" (UID: "411fdb53-fe43-4b10-bbf5-099eff379a30"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.417553 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411fdb53-fe43-4b10-bbf5-099eff379a30-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.421614 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr" (OuterVolumeSpecName: "kube-api-access-pfkrr") pod "411fdb53-fe43-4b10-bbf5-099eff379a30" (UID: "411fdb53-fe43-4b10-bbf5-099eff379a30"). InnerVolumeSpecName "kube-api-access-pfkrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.453653 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "411fdb53-fe43-4b10-bbf5-099eff379a30" (UID: "411fdb53-fe43-4b10-bbf5-099eff379a30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.458117 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data" (OuterVolumeSpecName: "config-data") pod "411fdb53-fe43-4b10-bbf5-099eff379a30" (UID: "411fdb53-fe43-4b10-bbf5-099eff379a30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.519628 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfkrr\" (UniqueName: \"kubernetes.io/projected/411fdb53-fe43-4b10-bbf5-099eff379a30-kube-api-access-pfkrr\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.519666 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:22 crc kubenswrapper[5033]: I0319 19:18:22.519675 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411fdb53-fe43-4b10-bbf5-099eff379a30-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.213808 5033 generic.go:334] "Generic (PLEG): container finished" podID="38730c07-3023-4319-a997-da522874a9aa" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" exitCode=0 Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.213885 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.214189 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38730c07-3023-4319-a997-da522874a9aa","Type":"ContainerDied","Data":"714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4"} Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.235037 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.269571 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.283515 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.284020 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="dnsmasq-dns" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284044 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="dnsmasq-dns" Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.284060 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="init" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284068 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="init" Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.284080 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-log" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284086 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-log" Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 
19:18:23.284112 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-api" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284118 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-api" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284360 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-log" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284379 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="75101984-de53-4709-8870-e919c64bfd54" containerName="dnsmasq-dns" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.284403 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" containerName="nova-api-api" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.285649 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.287906 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.295369 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.300284 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4 is running failed: container process not found" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.300773 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4 is running failed: container process not found" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.301190 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4 is running failed: container process not found" containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:18:23 crc kubenswrapper[5033]: E0319 19:18:23.301349 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.440576 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.440655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vct9\" (UniqueName: \"kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.440730 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.440801 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.542564 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs\") pod \"nova-api-0\" (UID: 
\"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.542650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vct9\" (UniqueName: \"kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.542707 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.542772 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.544136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.551105 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.551124 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.563966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vct9\" (UniqueName: \"kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9\") pod \"nova-api-0\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.615698 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.683743 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.864607 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwtm\" (UniqueName: \"kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm\") pod \"38730c07-3023-4319-a997-da522874a9aa\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.865060 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data\") pod \"38730c07-3023-4319-a997-da522874a9aa\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.865128 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle\") pod \"38730c07-3023-4319-a997-da522874a9aa\" (UID: \"38730c07-3023-4319-a997-da522874a9aa\") " Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 
19:18:23.869630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm" (OuterVolumeSpecName: "kube-api-access-vxwtm") pod "38730c07-3023-4319-a997-da522874a9aa" (UID: "38730c07-3023-4319-a997-da522874a9aa"). InnerVolumeSpecName "kube-api-access-vxwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.909382 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38730c07-3023-4319-a997-da522874a9aa" (UID: "38730c07-3023-4319-a997-da522874a9aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:23 crc kubenswrapper[5033]: I0319 19:18:23.914061 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data" (OuterVolumeSpecName: "config-data") pod "38730c07-3023-4319-a997-da522874a9aa" (UID: "38730c07-3023-4319-a997-da522874a9aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.012081 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxwtm\" (UniqueName: \"kubernetes.io/projected/38730c07-3023-4319-a997-da522874a9aa-kube-api-access-vxwtm\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.012110 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.012119 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38730c07-3023-4319-a997-da522874a9aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:24 crc kubenswrapper[5033]: W0319 19:18:24.097486 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab0c33ca_2456_4d46_8619_c32984db5f81.slice/crio-8a7f43f18784d9aeccea6a43003abf2aace426bf83488500465f962d56aea900 WatchSource:0}: Error finding container 8a7f43f18784d9aeccea6a43003abf2aace426bf83488500465f962d56aea900: Status 404 returned error can't find the container with id 8a7f43f18784d9aeccea6a43003abf2aace426bf83488500465f962d56aea900 Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.099132 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.235764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38730c07-3023-4319-a997-da522874a9aa","Type":"ContainerDied","Data":"9f5ca3f23bd3e80537b871e2b0af5b077e5fb10cb9fa7d2868d1cd8a4736b94c"} Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.236097 5033 scope.go:117] "RemoveContainer" 
containerID="714f75c3c77507528481e0dc6f87702e38425317100348b7e707648d0ceef0a4" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.235809 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.237878 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerStarted","Data":"8a7f43f18784d9aeccea6a43003abf2aace426bf83488500465f962d56aea900"} Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.272283 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.282367 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.298067 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:24 crc kubenswrapper[5033]: E0319 19:18:24.298607 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.298625 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.298853 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="38730c07-3023-4319-a997-da522874a9aa" containerName="nova-scheduler-scheduler" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.299621 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.301331 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.309733 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.326149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.326297 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.326345 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92j8\" (UniqueName: \"kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.427975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.428141 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.428199 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p92j8\" (UniqueName: \"kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.439572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.439711 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.443340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92j8\" (UniqueName: \"kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8\") pod \"nova-scheduler-0\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.615999 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.631603 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38730c07-3023-4319-a997-da522874a9aa" path="/var/lib/kubelet/pods/38730c07-3023-4319-a997-da522874a9aa/volumes" Mar 19 19:18:24 crc kubenswrapper[5033]: I0319 19:18:24.632634 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411fdb53-fe43-4b10-bbf5-099eff379a30" path="/var/lib/kubelet/pods/411fdb53-fe43-4b10-bbf5-099eff379a30/volumes" Mar 19 19:18:25 crc kubenswrapper[5033]: I0319 19:18:25.192435 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:18:25 crc kubenswrapper[5033]: W0319 19:18:25.199376 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc912b748_9cd3_42fd_9d8a_58dca0e5d2fa.slice/crio-17974f4acc88cd96b0629d9799c4d07dde757d5f422e133a4bdd75ad5479a83d WatchSource:0}: Error finding container 17974f4acc88cd96b0629d9799c4d07dde757d5f422e133a4bdd75ad5479a83d: Status 404 returned error can't find the container with id 17974f4acc88cd96b0629d9799c4d07dde757d5f422e133a4bdd75ad5479a83d Mar 19 19:18:25 crc kubenswrapper[5033]: I0319 19:18:25.252987 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerStarted","Data":"8d00c4f64c3bc9598f8e64856e762b7b2bb08e0f56c294cbc2ea05f764f446aa"} Mar 19 19:18:25 crc kubenswrapper[5033]: I0319 19:18:25.253037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerStarted","Data":"cb84ec61b201819952899810510e22b1b2c5d18c4c0f8f0e4621b6272235b66b"} Mar 19 19:18:25 crc kubenswrapper[5033]: I0319 19:18:25.256433 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa","Type":"ContainerStarted","Data":"17974f4acc88cd96b0629d9799c4d07dde757d5f422e133a4bdd75ad5479a83d"} Mar 19 19:18:25 crc kubenswrapper[5033]: I0319 19:18:25.304291 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.304266252 podStartE2EDuration="2.304266252s" podCreationTimestamp="2026-03-19 19:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:25.296722148 +0000 UTC m=+1315.401752017" watchObservedRunningTime="2026-03-19 19:18:25.304266252 +0000 UTC m=+1315.409296101" Mar 19 19:18:26 crc kubenswrapper[5033]: I0319 19:18:26.267406 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa","Type":"ContainerStarted","Data":"091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0"} Mar 19 19:18:26 crc kubenswrapper[5033]: I0319 19:18:26.283495 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.283472606 podStartE2EDuration="2.283472606s" podCreationTimestamp="2026-03-19 19:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:26.280562744 +0000 UTC m=+1316.385592593" watchObservedRunningTime="2026-03-19 19:18:26.283472606 +0000 UTC m=+1316.388502455" Mar 19 19:18:28 crc kubenswrapper[5033]: I0319 19:18:28.558709 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 19:18:29 crc kubenswrapper[5033]: I0319 19:18:29.616252 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 19:18:33 crc kubenswrapper[5033]: I0319 19:18:33.349536 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 19:18:33 crc kubenswrapper[5033]: I0319 19:18:33.617283 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:18:33 crc kubenswrapper[5033]: I0319 19:18:33.617624 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:18:34 crc kubenswrapper[5033]: I0319 19:18:34.616888 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:18:34 crc kubenswrapper[5033]: I0319 19:18:34.655166 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:18:34 crc kubenswrapper[5033]: I0319 19:18:34.699682 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:34 crc kubenswrapper[5033]: I0319 19:18:34.699695 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:35 crc kubenswrapper[5033]: I0319 19:18:35.385214 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:18:40 crc kubenswrapper[5033]: I0319 19:18:40.758882 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 19 19:18:40 crc kubenswrapper[5033]: I0319 19:18:40.759503 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:18:40 crc kubenswrapper[5033]: I0319 19:18:40.759558 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:18:40 crc kubenswrapper[5033]: I0319 19:18:40.760361 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:18:40 crc kubenswrapper[5033]: I0319 19:18:40.760426 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f" gracePeriod=600 Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.412515 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f" exitCode=0 Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.412698 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f"} Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.412928 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce"} Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.412954 5033 scope.go:117] "RemoveContainer" containerID="8359c8e2cf6fcab67332f777b01998c839489a3f9866bc960369e3e2dc539750" Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.420326 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e57df54-2285-4657-b3aa-4cdaf6a81269" containerID="890b5f8499ee8ffd2c17b58c5564ce2517c835ed6dab6364dc66b90379783541" exitCode=137 Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.420411 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e57df54-2285-4657-b3aa-4cdaf6a81269","Type":"ContainerDied","Data":"890b5f8499ee8ffd2c17b58c5564ce2517c835ed6dab6364dc66b90379783541"} Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.441887 5033 generic.go:334] "Generic (PLEG): container finished" podID="7988b549-babc-47eb-903d-4018cae11462" containerID="ed85073fa542ed90de50fb07d70770740f6af11c7ea66d6b870c30ee7eec070f" exitCode=137 Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.441933 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerDied","Data":"ed85073fa542ed90de50fb07d70770740f6af11c7ea66d6b870c30ee7eec070f"} Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 19:18:41.617189 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:18:41 crc kubenswrapper[5033]: I0319 
19:18:41.618206 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.041372 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.054864 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.097130 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvnx5\" (UniqueName: \"kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5\") pod \"8e57df54-2285-4657-b3aa-4cdaf6a81269\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.097398 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data\") pod \"8e57df54-2285-4657-b3aa-4cdaf6a81269\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.097461 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle\") pod \"8e57df54-2285-4657-b3aa-4cdaf6a81269\" (UID: \"8e57df54-2285-4657-b3aa-4cdaf6a81269\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.128375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5" (OuterVolumeSpecName: "kube-api-access-nvnx5") pod "8e57df54-2285-4657-b3aa-4cdaf6a81269" (UID: "8e57df54-2285-4657-b3aa-4cdaf6a81269"). InnerVolumeSpecName "kube-api-access-nvnx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.157204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e57df54-2285-4657-b3aa-4cdaf6a81269" (UID: "8e57df54-2285-4657-b3aa-4cdaf6a81269"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.201462 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data\") pod \"7988b549-babc-47eb-903d-4018cae11462\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.201577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs\") pod \"7988b549-babc-47eb-903d-4018cae11462\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.201766 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle\") pod \"7988b549-babc-47eb-903d-4018cae11462\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.201805 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkqzm\" (UniqueName: \"kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm\") pod \"7988b549-babc-47eb-903d-4018cae11462\" (UID: \"7988b549-babc-47eb-903d-4018cae11462\") " Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.202532 5033 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvnx5\" (UniqueName: \"kubernetes.io/projected/8e57df54-2285-4657-b3aa-4cdaf6a81269-kube-api-access-nvnx5\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.202558 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.202518 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs" (OuterVolumeSpecName: "logs") pod "7988b549-babc-47eb-903d-4018cae11462" (UID: "7988b549-babc-47eb-903d-4018cae11462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.224704 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm" (OuterVolumeSpecName: "kube-api-access-hkqzm") pod "7988b549-babc-47eb-903d-4018cae11462" (UID: "7988b549-babc-47eb-903d-4018cae11462"). InnerVolumeSpecName "kube-api-access-hkqzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.265571 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data" (OuterVolumeSpecName: "config-data") pod "8e57df54-2285-4657-b3aa-4cdaf6a81269" (UID: "8e57df54-2285-4657-b3aa-4cdaf6a81269"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.265662 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7988b549-babc-47eb-903d-4018cae11462" (UID: "7988b549-babc-47eb-903d-4018cae11462"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.266290 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data" (OuterVolumeSpecName: "config-data") pod "7988b549-babc-47eb-903d-4018cae11462" (UID: "7988b549-babc-47eb-903d-4018cae11462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.305769 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e57df54-2285-4657-b3aa-4cdaf6a81269-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.306023 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.306094 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkqzm\" (UniqueName: \"kubernetes.io/projected/7988b549-babc-47eb-903d-4018cae11462-kube-api-access-hkqzm\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.306157 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b549-babc-47eb-903d-4018cae11462-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.306254 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b549-babc-47eb-903d-4018cae11462-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.455924 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.455924 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e57df54-2285-4657-b3aa-4cdaf6a81269","Type":"ContainerDied","Data":"96795bfdd6eaae133315d57d11886328cedd930e9bc3c17c24d95c32599fcc33"} Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.456048 5033 scope.go:117] "RemoveContainer" containerID="890b5f8499ee8ffd2c17b58c5564ce2517c835ed6dab6364dc66b90379783541" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.461209 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7988b549-babc-47eb-903d-4018cae11462","Type":"ContainerDied","Data":"b629cff1fd3714e56e93f5e60ce16264465a1d0f8c58801ec8b9367419760d0b"} Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.461263 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.489836 5033 scope.go:117] "RemoveContainer" containerID="ed85073fa542ed90de50fb07d70770740f6af11c7ea66d6b870c30ee7eec070f" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.511841 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.528264 5033 scope.go:117] "RemoveContainer" containerID="4dc22f34403355d654d452b51f8e00f538ce9efd39f45c0befc4e22534640658" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.528382 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.568097 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.592262 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.606335 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: E0319 19:18:42.606832 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e57df54-2285-4657-b3aa-4cdaf6a81269" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.606853 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e57df54-2285-4657-b3aa-4cdaf6a81269" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:18:42 crc kubenswrapper[5033]: E0319 19:18:42.606917 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-log" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.606927 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b549-babc-47eb-903d-4018cae11462" 
containerName="nova-metadata-log" Mar 19 19:18:42 crc kubenswrapper[5033]: E0319 19:18:42.606944 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-metadata" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.606952 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-metadata" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.607187 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-metadata" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.607221 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e57df54-2285-4657-b3aa-4cdaf6a81269" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.607240 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7988b549-babc-47eb-903d-4018cae11462" containerName="nova-metadata-log" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.608113 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.613801 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.614181 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.614303 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.619931 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.642706 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7988b549-babc-47eb-903d-4018cae11462" path="/var/lib/kubelet/pods/7988b549-babc-47eb-903d-4018cae11462/volumes" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.643270 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e57df54-2285-4657-b3aa-4cdaf6a81269" path="/var/lib/kubelet/pods/8e57df54-2285-4657-b3aa-4cdaf6a81269/volumes" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.643771 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.645236 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.645317 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.647674 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.656265 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714521 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714590 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714626 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsrc4\" (UniqueName: \"kubernetes.io/projected/5d16067a-8ac2-4949-931e-e874147e40dc-kube-api-access-dsrc4\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714651 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714761 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwkcp\" (UniqueName: \"kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714845 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.714862 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc 
kubenswrapper[5033]: I0319 19:18:42.714878 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.816922 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.816981 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.817006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.817046 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.817087 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.817718 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsrc4\" (UniqueName: \"kubernetes.io/projected/5d16067a-8ac2-4949-931e-e874147e40dc-kube-api-access-dsrc4\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.817527 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.818132 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.818535 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.818575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.818604 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwkcp\" (UniqueName: \"kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.821033 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.821348 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.821351 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.821720 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 
19:18:42.821968 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.822220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.822431 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d16067a-8ac2-4949-931e-e874147e40dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.837006 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsrc4\" (UniqueName: \"kubernetes.io/projected/5d16067a-8ac2-4949-931e-e874147e40dc-kube-api-access-dsrc4\") pod \"nova-cell1-novncproxy-0\" (UID: \"5d16067a-8ac2-4949-931e-e874147e40dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.845912 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwkcp\" (UniqueName: \"kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp\") pod \"nova-metadata-0\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " pod="openstack/nova-metadata-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.925555 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:42 crc kubenswrapper[5033]: I0319 19:18:42.959955 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:18:43 crc kubenswrapper[5033]: W0319 19:18:43.457696 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d16067a_8ac2_4949_931e_e874147e40dc.slice/crio-73e37395bfb6f2dd3be6de69284ea627b9e494342936353b428444453ca00184 WatchSource:0}: Error finding container 73e37395bfb6f2dd3be6de69284ea627b9e494342936353b428444453ca00184: Status 404 returned error can't find the container with id 73e37395bfb6f2dd3be6de69284ea627b9e494342936353b428444453ca00184 Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.466372 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.485829 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5d16067a-8ac2-4949-931e-e874147e40dc","Type":"ContainerStarted","Data":"73e37395bfb6f2dd3be6de69284ea627b9e494342936353b428444453ca00184"} Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.625722 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.626493 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.631309 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:18:43 crc kubenswrapper[5033]: I0319 19:18:43.692039 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.502138 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerStarted","Data":"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc"} Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.502474 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerStarted","Data":"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232"} Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.502492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerStarted","Data":"cd0a6d7b30d34b06217e107cb346b2bb4386741654e7ca6a349f9a1f2d4e92d7"} Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.503841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5d16067a-8ac2-4949-931e-e874147e40dc","Type":"ContainerStarted","Data":"05639ddd70c42b7723c4debdea0016d0fcfa32cfd82b8cb0d38ef980213342ff"} Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.508395 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.525818 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.525801108 podStartE2EDuration="2.525801108s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:44.518731848 +0000 UTC m=+1334.623761707" watchObservedRunningTime="2026-03-19 19:18:44.525801108 +0000 UTC m=+1334.630830957" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.548983 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.548964816 podStartE2EDuration="2.548964816s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:44.5445389 +0000 UTC m=+1334.649568749" watchObservedRunningTime="2026-03-19 19:18:44.548964816 +0000 UTC m=+1334.653994665" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.765858 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.767547 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.782039 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.888540 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.888684 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.888877 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.888909 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.888983 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xtn\" (UniqueName: \"kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.889362 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991256 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7xtn\" (UniqueName: \"kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991784 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991840 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.991914 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.992061 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.992260 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.992747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.992748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:44 crc kubenswrapper[5033]: I0319 19:18:44.993183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:45 crc kubenswrapper[5033]: I0319 19:18:45.010364 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xtn\" (UniqueName: \"kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn\") pod 
\"dnsmasq-dns-5fd9b586ff-lcdt2\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:45 crc kubenswrapper[5033]: I0319 19:18:45.094001 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:45 crc kubenswrapper[5033]: I0319 19:18:45.703265 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:18:46 crc kubenswrapper[5033]: I0319 19:18:46.554762 5033 generic.go:334] "Generic (PLEG): container finished" podID="c1caae62-86e1-4c11-8499-52cc408eb399" containerID="33dc49f190eca0bb83f35e5c3d3afaf29cf6ec8cc4e645137c029679bb30e71f" exitCode=0 Mar 19 19:18:46 crc kubenswrapper[5033]: I0319 19:18:46.556799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" event={"ID":"c1caae62-86e1-4c11-8499-52cc408eb399","Type":"ContainerDied","Data":"33dc49f190eca0bb83f35e5c3d3afaf29cf6ec8cc4e645137c029679bb30e71f"} Mar 19 19:18:46 crc kubenswrapper[5033]: I0319 19:18:46.556831 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" event={"ID":"c1caae62-86e1-4c11-8499-52cc408eb399","Type":"ContainerStarted","Data":"252954f0eefb9e55dc0b4bb0d7bca5b5b90023315911b0643c98c86899e54645"} Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.385175 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.567498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" event={"ID":"c1caae62-86e1-4c11-8499-52cc408eb399","Type":"ContainerStarted","Data":"62c2fde7e847f78acffad4cfadc715fe289bcf13871bf818624f3d3a5558f81f"} Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.567756 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-api" containerID="cri-o://8d00c4f64c3bc9598f8e64856e762b7b2bb08e0f56c294cbc2ea05f764f446aa" gracePeriod=30 Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.567624 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-log" containerID="cri-o://cb84ec61b201819952899810510e22b1b2c5d18c4c0f8f0e4621b6272235b66b" gracePeriod=30 Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.616293 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" podStartSLOduration=3.6162763079999998 podStartE2EDuration="3.616276308s" podCreationTimestamp="2026-03-19 19:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:47.599255385 +0000 UTC m=+1337.704285254" watchObservedRunningTime="2026-03-19 19:18:47.616276308 +0000 UTC m=+1337.721306157" Mar 19 19:18:47 crc kubenswrapper[5033]: I0319 19:18:47.926567 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.312419 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.312718 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-central-agent" containerID="cri-o://69777b8be862a94478d21561352106d1a8b69660b3039131b7ac6563f9eaf850" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.312738 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" 
containerName="proxy-httpd" containerID="cri-o://238f55894c528ed48b9b04164f700458dc6ce6e32a8a80d1551a355e68e08e51" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.312839 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-notification-agent" containerID="cri-o://c0f423fd215ccfa1efe693e2b9d558acb629e3fe23d7f385319157763f75b801" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.313002 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="sg-core" containerID="cri-o://70f25493019bf89024c0cb6a9e2a1db1914ee810b49269043aa7bc48c5ff36ce" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.584831 5033 generic.go:334] "Generic (PLEG): container finished" podID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerID="cb84ec61b201819952899810510e22b1b2c5d18c4c0f8f0e4621b6272235b66b" exitCode=143 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.585009 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerDied","Data":"cb84ec61b201819952899810510e22b1b2c5d18c4c0f8f0e4621b6272235b66b"} Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.604911 5033 generic.go:334] "Generic (PLEG): container finished" podID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerID="238f55894c528ed48b9b04164f700458dc6ce6e32a8a80d1551a355e68e08e51" exitCode=0 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.604941 5033 generic.go:334] "Generic (PLEG): container finished" podID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerID="70f25493019bf89024c0cb6a9e2a1db1914ee810b49269043aa7bc48c5ff36ce" exitCode=2 Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.605010 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerDied","Data":"238f55894c528ed48b9b04164f700458dc6ce6e32a8a80d1551a355e68e08e51"} Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.605076 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerDied","Data":"70f25493019bf89024c0cb6a9e2a1db1914ee810b49269043aa7bc48c5ff36ce"} Mar 19 19:18:48 crc kubenswrapper[5033]: I0319 19:18:48.605137 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:49 crc kubenswrapper[5033]: I0319 19:18:49.615374 5033 generic.go:334] "Generic (PLEG): container finished" podID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerID="c0f423fd215ccfa1efe693e2b9d558acb629e3fe23d7f385319157763f75b801" exitCode=0 Mar 19 19:18:49 crc kubenswrapper[5033]: I0319 19:18:49.615404 5033 generic.go:334] "Generic (PLEG): container finished" podID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerID="69777b8be862a94478d21561352106d1a8b69660b3039131b7ac6563f9eaf850" exitCode=0 Mar 19 19:18:49 crc kubenswrapper[5033]: I0319 19:18:49.615410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerDied","Data":"c0f423fd215ccfa1efe693e2b9d558acb629e3fe23d7f385319157763f75b801"} Mar 19 19:18:49 crc kubenswrapper[5033]: I0319 19:18:49.615468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerDied","Data":"69777b8be862a94478d21561352106d1a8b69660b3039131b7ac6563f9eaf850"} Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.183638 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300263 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300542 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300595 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300742 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csx5d\" (UniqueName: \"kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300913 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300935 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.300974 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd\") pod \"285df789-64ed-4a58-a67a-13bb1ce6ed59\" (UID: \"285df789-64ed-4a58-a67a-13bb1ce6ed59\") " Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.301582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.303610 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.316519 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d" (OuterVolumeSpecName: "kube-api-access-csx5d") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "kube-api-access-csx5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.325571 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts" (OuterVolumeSpecName: "scripts") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.376596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.393598 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404368 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csx5d\" (UniqueName: \"kubernetes.io/projected/285df789-64ed-4a58-a67a-13bb1ce6ed59-kube-api-access-csx5d\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404403 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404414 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285df789-64ed-4a58-a67a-13bb1ce6ed59-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404425 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404436 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.404461 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.433832 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: 
"285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.450299 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data" (OuterVolumeSpecName: "config-data") pod "285df789-64ed-4a58-a67a-13bb1ce6ed59" (UID: "285df789-64ed-4a58-a67a-13bb1ce6ed59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.507026 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.507062 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285df789-64ed-4a58-a67a-13bb1ce6ed59-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.657101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285df789-64ed-4a58-a67a-13bb1ce6ed59","Type":"ContainerDied","Data":"ca5fdd4ee473a565342cd736552070e1f9efed135dc253399698d28e4e5392cd"} Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.657151 5033 scope.go:117] "RemoveContainer" containerID="238f55894c528ed48b9b04164f700458dc6ce6e32a8a80d1551a355e68e08e51" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.657293 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.699583 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.701667 5033 scope.go:117] "RemoveContainer" containerID="70f25493019bf89024c0cb6a9e2a1db1914ee810b49269043aa7bc48c5ff36ce" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.722790 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.734116 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:50 crc kubenswrapper[5033]: E0319 19:18:50.741017 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-notification-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741069 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-notification-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: E0319 19:18:50.741098 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="sg-core" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741106 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="sg-core" Mar 19 19:18:50 crc kubenswrapper[5033]: E0319 19:18:50.741137 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-central-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741144 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-central-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: E0319 19:18:50.741168 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="proxy-httpd" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741174 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="proxy-httpd" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741956 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-notification-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.741996 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="sg-core" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.742021 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="proxy-httpd" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.742042 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" containerName="ceilometer-central-agent" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.748523 5033 scope.go:117] "RemoveContainer" containerID="c0f423fd215ccfa1efe693e2b9d558acb629e3fe23d7f385319157763f75b801" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.765476 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.768786 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.769041 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.769228 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.772934 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.804274 5033 scope.go:117] "RemoveContainer" containerID="69777b8be862a94478d21561352106d1a8b69660b3039131b7ac6563f9eaf850" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.915880 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zq57\" (UniqueName: \"kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " 
pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916687 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916891 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.916934 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:50 crc kubenswrapper[5033]: I0319 19:18:50.917015 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.018846 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.018893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.018983 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019064 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019110 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zq57\" (UniqueName: \"kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57\") pod \"ceilometer-0\" (UID: 
\"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019150 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.019919 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.020192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.024139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.024207 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.024318 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.024594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.027543 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.038869 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zq57\" (UniqueName: \"kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57\") pod \"ceilometer-0\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.041183 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.042125 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.671142 5033 generic.go:334] "Generic (PLEG): container finished" podID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerID="8d00c4f64c3bc9598f8e64856e762b7b2bb08e0f56c294cbc2ea05f764f446aa" exitCode=0 Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.671225 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerDied","Data":"8d00c4f64c3bc9598f8e64856e762b7b2bb08e0f56c294cbc2ea05f764f446aa"} Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.831080 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:51 crc kubenswrapper[5033]: I0319 19:18:51.944517 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.045792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs\") pod \"ab0c33ca-2456-4d46-8619-c32984db5f81\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.046216 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs" (OuterVolumeSpecName: "logs") pod "ab0c33ca-2456-4d46-8619-c32984db5f81" (UID: "ab0c33ca-2456-4d46-8619-c32984db5f81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.046313 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vct9\" (UniqueName: \"kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9\") pod \"ab0c33ca-2456-4d46-8619-c32984db5f81\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.046375 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data\") pod \"ab0c33ca-2456-4d46-8619-c32984db5f81\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.046426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle\") pod \"ab0c33ca-2456-4d46-8619-c32984db5f81\" (UID: \"ab0c33ca-2456-4d46-8619-c32984db5f81\") " Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.046944 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c33ca-2456-4d46-8619-c32984db5f81-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.057831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9" (OuterVolumeSpecName: "kube-api-access-6vct9") pod "ab0c33ca-2456-4d46-8619-c32984db5f81" (UID: "ab0c33ca-2456-4d46-8619-c32984db5f81"). InnerVolumeSpecName "kube-api-access-6vct9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.083611 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0c33ca-2456-4d46-8619-c32984db5f81" (UID: "ab0c33ca-2456-4d46-8619-c32984db5f81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.099599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data" (OuterVolumeSpecName: "config-data") pod "ab0c33ca-2456-4d46-8619-c32984db5f81" (UID: "ab0c33ca-2456-4d46-8619-c32984db5f81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.148944 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vct9\" (UniqueName: \"kubernetes.io/projected/ab0c33ca-2456-4d46-8619-c32984db5f81-kube-api-access-6vct9\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.148985 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.148996 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c33ca-2456-4d46-8619-c32984db5f81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.286630 5033 scope.go:117] "RemoveContainer" containerID="0c0d4dc9d0d61c82b2904e645db0581653effcaf5751d24ff7d46c670c86f5b0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.637888 5033 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285df789-64ed-4a58-a67a-13bb1ce6ed59" path="/var/lib/kubelet/pods/285df789-64ed-4a58-a67a-13bb1ce6ed59/volumes" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.682841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab0c33ca-2456-4d46-8619-c32984db5f81","Type":"ContainerDied","Data":"8a7f43f18784d9aeccea6a43003abf2aace426bf83488500465f962d56aea900"} Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.682864 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.682898 5033 scope.go:117] "RemoveContainer" containerID="8d00c4f64c3bc9598f8e64856e762b7b2bb08e0f56c294cbc2ea05f764f446aa" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.684901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerStarted","Data":"d87e7406e2a5345134ce8880212a0b0491211c71e7425426f4399ea8618a3cca"} Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.684930 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerStarted","Data":"381c8f63955672d16cef7a663cd22d8665918eaa9cb1be0655208236591f8a0e"} Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.712175 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.721260 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.722303 5033 scope.go:117] "RemoveContainer" containerID="cb84ec61b201819952899810510e22b1b2c5d18c4c0f8f0e4621b6272235b66b" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.734219 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:52 crc kubenswrapper[5033]: E0319 19:18:52.734658 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-api" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.734674 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-api" Mar 19 19:18:52 crc kubenswrapper[5033]: E0319 19:18:52.734725 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-log" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.734731 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-log" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.734923 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-log" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.734932 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" containerName="nova-api-api" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.735975 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.738427 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.742677 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.750988 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.752039 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.872324 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.872822 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.872966 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.873098 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.873331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2zb\" (UniqueName: \"kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.873382 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.927416 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.951920 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.961616 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.961654 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.976292 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 
crc kubenswrapper[5033]: I0319 19:18:52.976601 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.976756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.977017 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2zb\" (UniqueName: \"kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.977068 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.977289 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.977775 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs\") pod \"nova-api-0\" (UID: 
\"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.981578 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.982393 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.983610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:52 crc kubenswrapper[5033]: I0319 19:18:52.984193 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.006620 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2zb\" (UniqueName: \"kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb\") pod \"nova-api-0\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " pod="openstack/nova-api-0" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.067355 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.700931 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerStarted","Data":"37f9d5b97ec3abe0d013eb0e500f96e7a05e959b4d94952aad2a817dd8fed722"} Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.735779 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.804655 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.807512 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.821344 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.898714 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57g7\" (UniqueName: \"kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.898837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.898920 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.915266 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jrt6w"] Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.916763 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.926273 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.926773 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.932855 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.941148 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jrt6w"] Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.987492 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:53 crc kubenswrapper[5033]: I0319 19:18:53.987694 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.232:8775/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.001920 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57g7\" (UniqueName: \"kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002140 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002185 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " 
pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.002256 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blb6s\" (UniqueName: \"kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.003012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.003241 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.030442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57g7\" (UniqueName: \"kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7\") pod \"redhat-operators-45qtt\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " 
pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.103972 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.104063 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.104093 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blb6s\" (UniqueName: \"kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.104126 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.107278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc 
kubenswrapper[5033]: I0319 19:18:54.107790 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.108126 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.131812 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blb6s\" (UniqueName: \"kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s\") pod \"nova-cell1-cell-mapping-jrt6w\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.147966 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.298212 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.691198 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0c33ca-2456-4d46-8619-c32984db5f81" path="/var/lib/kubelet/pods/ab0c33ca-2456-4d46-8619-c32984db5f81/volumes" Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.746518 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerStarted","Data":"af89577753a9d5be74c6933d2f9f5080bc862d2a2a8c04abe3d3014e91a76813"} Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.746571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerStarted","Data":"b183d070633523f339b88bf4a35db5254b6679513d57ddb64cfc460699e3a663"} Mar 19 19:18:54 crc kubenswrapper[5033]: I0319 19:18:54.760571 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.095635 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.187197 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.188019 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="dnsmasq-dns" containerID="cri-o://3daaeb15559cdd14ae0a13a1ac3b4f2f176eb533ec87731bf198329561e9fd76" gracePeriod=10 Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.255605 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jrt6w"] Mar 19 19:18:55 crc kubenswrapper[5033]: 
I0319 19:18:55.772840 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jrt6w" event={"ID":"451a5336-669f-41f5-aeb9-0c9db1ba5557","Type":"ContainerStarted","Data":"e5e595383e1e75ea4efae8a39866965a0131ce0fc4b26d07c445ee62e6c1bf99"} Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.775934 5033 generic.go:334] "Generic (PLEG): container finished" podID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerID="3daaeb15559cdd14ae0a13a1ac3b4f2f176eb533ec87731bf198329561e9fd76" exitCode=0 Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.775983 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" event={"ID":"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35","Type":"ContainerDied","Data":"3daaeb15559cdd14ae0a13a1ac3b4f2f176eb533ec87731bf198329561e9fd76"} Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.781061 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerStarted","Data":"74d1efb6ee86a2bd002a7692b1ef1c88f8536e778963a6f10d29c0ba44791655"} Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.782249 5033 generic.go:334] "Generic (PLEG): container finished" podID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerID="e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464" exitCode=0 Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.782286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerDied","Data":"e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464"} Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.782299 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" 
event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerStarted","Data":"630116de335554a280330ba636f41218c6775c4898985b0bcf730e3254a8f11b"} Mar 19 19:18:55 crc kubenswrapper[5033]: I0319 19:18:55.806886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerStarted","Data":"07f4a379355d646ed834ec3d608fc5edbed479ffa3624cd83506ed4319512a62"} Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.724540 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.755499 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.755479896 podStartE2EDuration="4.755479896s" podCreationTimestamp="2026-03-19 19:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:55.843776036 +0000 UTC m=+1345.948805885" watchObservedRunningTime="2026-03-19 19:18:56.755479896 +0000 UTC m=+1346.860509745" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798279 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798400 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798513 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798640 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798666 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.798724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9v72\" (UniqueName: \"kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72\") pod \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\" (UID: \"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35\") " Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.804584 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72" (OuterVolumeSpecName: "kube-api-access-h9v72") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "kube-api-access-h9v72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.831902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jrt6w" event={"ID":"451a5336-669f-41f5-aeb9-0c9db1ba5557","Type":"ContainerStarted","Data":"332084d4eace736cc48044ca5babb2d7fced82ca3b44c054162404b3a02ce78e"} Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.836173 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.836909 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-8rq6r" event={"ID":"4d9b55a9-7443-4a4f-ad5a-7eca2995cf35","Type":"ContainerDied","Data":"984581f8fb999b1d5dfd3809202be1d3e6a3c7efcde1529186bc65d479543cc1"} Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.836955 5033 scope.go:117] "RemoveContainer" containerID="3daaeb15559cdd14ae0a13a1ac3b4f2f176eb533ec87731bf198329561e9fd76" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.867495 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jrt6w" podStartSLOduration=3.867475313 podStartE2EDuration="3.867475313s" podCreationTimestamp="2026-03-19 19:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:56.854591938 +0000 UTC m=+1346.959621787" watchObservedRunningTime="2026-03-19 19:18:56.867475313 +0000 UTC m=+1346.972505162" Mar 19 19:18:56 crc kubenswrapper[5033]: I0319 19:18:56.902703 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9v72\" (UniqueName: \"kubernetes.io/projected/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-kube-api-access-h9v72\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.038579 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.055613 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.065308 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.070954 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.072098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config" (OuterVolumeSpecName: "config") pod "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" (UID: "4d9b55a9-7443-4a4f-ad5a-7eca2995cf35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.103020 5033 scope.go:117] "RemoveContainer" containerID="0b083343ad67b6a25ee23998bfd6a2acda3d55515c740d00efe84ae16f3c283d" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.105557 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.105583 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.105596 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.105610 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.105620 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc 
kubenswrapper[5033]: I0319 19:18:57.193460 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.208243 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-8rq6r"] Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.846181 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerStarted","Data":"820213de6b735b27fabb7f0fd0ab21aa9de9e75653a0a0b20a93581935bc67a5"} Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.846676 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.846727 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="sg-core" containerID="cri-o://74d1efb6ee86a2bd002a7692b1ef1c88f8536e778963a6f10d29c0ba44791655" gracePeriod=30 Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.846842 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="proxy-httpd" containerID="cri-o://820213de6b735b27fabb7f0fd0ab21aa9de9e75653a0a0b20a93581935bc67a5" gracePeriod=30 Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.846838 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-notification-agent" containerID="cri-o://37f9d5b97ec3abe0d013eb0e500f96e7a05e959b4d94952aad2a817dd8fed722" gracePeriod=30 Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.847510 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-central-agent" containerID="cri-o://d87e7406e2a5345134ce8880212a0b0491211c71e7425426f4399ea8618a3cca" gracePeriod=30 Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.850412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerStarted","Data":"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1"} Mar 19 19:18:57 crc kubenswrapper[5033]: I0319 19:18:57.912388 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.947051021 podStartE2EDuration="7.91237084s" podCreationTimestamp="2026-03-19 19:18:50 +0000 UTC" firstStartedPulling="2026-03-19 19:18:51.83150934 +0000 UTC m=+1341.936539189" lastFinishedPulling="2026-03-19 19:18:56.796829159 +0000 UTC m=+1346.901859008" observedRunningTime="2026-03-19 19:18:57.884361486 +0000 UTC m=+1347.989391335" watchObservedRunningTime="2026-03-19 19:18:57.91237084 +0000 UTC m=+1348.017400689" Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.635981 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" path="/var/lib/kubelet/pods/4d9b55a9-7443-4a4f-ad5a-7eca2995cf35/volumes" Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879432 5033 generic.go:334] "Generic (PLEG): container finished" podID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerID="820213de6b735b27fabb7f0fd0ab21aa9de9e75653a0a0b20a93581935bc67a5" exitCode=0 Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879509 5033 generic.go:334] "Generic (PLEG): container finished" podID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerID="74d1efb6ee86a2bd002a7692b1ef1c88f8536e778963a6f10d29c0ba44791655" exitCode=2 Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879521 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerID="37f9d5b97ec3abe0d013eb0e500f96e7a05e959b4d94952aad2a817dd8fed722" exitCode=0 Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879493 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerDied","Data":"820213de6b735b27fabb7f0fd0ab21aa9de9e75653a0a0b20a93581935bc67a5"} Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879626 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerDied","Data":"74d1efb6ee86a2bd002a7692b1ef1c88f8536e778963a6f10d29c0ba44791655"} Mar 19 19:18:58 crc kubenswrapper[5033]: I0319 19:18:58.879639 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerDied","Data":"37f9d5b97ec3abe0d013eb0e500f96e7a05e959b4d94952aad2a817dd8fed722"} Mar 19 19:19:00 crc kubenswrapper[5033]: I0319 19:19:00.902101 5033 generic.go:334] "Generic (PLEG): container finished" podID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerID="d87e7406e2a5345134ce8880212a0b0491211c71e7425426f4399ea8618a3cca" exitCode=0 Mar 19 19:19:00 crc kubenswrapper[5033]: I0319 19:19:00.902182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerDied","Data":"d87e7406e2a5345134ce8880212a0b0491211c71e7425426f4399ea8618a3cca"} Mar 19 19:19:00 crc kubenswrapper[5033]: I0319 19:19:00.961186 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:19:00 crc kubenswrapper[5033]: I0319 19:19:00.961251 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.886748 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913243 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zq57\" (UniqueName: \"kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913523 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913544 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.913671 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle\") pod \"98eb0a84-f09c-46ee-bf72-d33857612f67\" (UID: \"98eb0a84-f09c-46ee-bf72-d33857612f67\") " Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.914214 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.914348 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.915352 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.927158 5033 generic.go:334] "Generic (PLEG): container finished" podID="451a5336-669f-41f5-aeb9-0c9db1ba5557" containerID="332084d4eace736cc48044ca5babb2d7fced82ca3b44c054162404b3a02ce78e" exitCode=0 Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.927320 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jrt6w" event={"ID":"451a5336-669f-41f5-aeb9-0c9db1ba5557","Type":"ContainerDied","Data":"332084d4eace736cc48044ca5babb2d7fced82ca3b44c054162404b3a02ce78e"} Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.933668 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts" (OuterVolumeSpecName: "scripts") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.944751 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57" (OuterVolumeSpecName: "kube-api-access-6zq57") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "kube-api-access-6zq57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.948625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98eb0a84-f09c-46ee-bf72-d33857612f67","Type":"ContainerDied","Data":"381c8f63955672d16cef7a663cd22d8665918eaa9cb1be0655208236591f8a0e"} Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.948676 5033 scope.go:117] "RemoveContainer" containerID="820213de6b735b27fabb7f0fd0ab21aa9de9e75653a0a0b20a93581935bc67a5" Mar 19 19:19:01 crc kubenswrapper[5033]: I0319 19:19:01.948815 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.002352 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.016581 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.016631 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98eb0a84-f09c-46ee-bf72-d33857612f67-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.016642 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zq57\" (UniqueName: \"kubernetes.io/projected/98eb0a84-f09c-46ee-bf72-d33857612f67-kube-api-access-6zq57\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.016651 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.033198 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.092745 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.107532 5033 scope.go:117] "RemoveContainer" containerID="74d1efb6ee86a2bd002a7692b1ef1c88f8536e778963a6f10d29c0ba44791655" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.110021 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data" (OuterVolumeSpecName: "config-data") pod "98eb0a84-f09c-46ee-bf72-d33857612f67" (UID: "98eb0a84-f09c-46ee-bf72-d33857612f67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.118739 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.118762 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.118771 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98eb0a84-f09c-46ee-bf72-d33857612f67-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.166158 5033 scope.go:117] "RemoveContainer" containerID="37f9d5b97ec3abe0d013eb0e500f96e7a05e959b4d94952aad2a817dd8fed722" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.186120 5033 scope.go:117] "RemoveContainer" containerID="d87e7406e2a5345134ce8880212a0b0491211c71e7425426f4399ea8618a3cca" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.316672 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:02 crc 
kubenswrapper[5033]: I0319 19:19:02.346051 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.362920 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.364702 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="proxy-httpd" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.364724 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="proxy-httpd" Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.364984 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="dnsmasq-dns" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365006 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="dnsmasq-dns" Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.365025 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="sg-core" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365034 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="sg-core" Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.365067 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-notification-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365074 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-notification-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.365271 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="init" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365283 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="init" Mar 19 19:19:02 crc kubenswrapper[5033]: E0319 19:19:02.365304 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-central-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365312 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-central-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.365995 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="sg-core" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.366030 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9b55a9-7443-4a4f-ad5a-7eca2995cf35" containerName="dnsmasq-dns" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.366051 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="proxy-httpd" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.366063 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-central-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.366083 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" containerName="ceilometer-notification-agent" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.371122 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.379428 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.379683 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.379914 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.386210 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432221 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432250 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432286 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432354 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrzq\" (UniqueName: \"kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.432399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.535872 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.535992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536017 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536053 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536157 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrzq\" 
(UniqueName: \"kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536197 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.536737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.537308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.539280 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.539989 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.540871 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.541459 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.542092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.562979 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrzq\" (UniqueName: \"kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq\") pod \"ceilometer-0\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.630443 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98eb0a84-f09c-46ee-bf72-d33857612f67" path="/var/lib/kubelet/pods/98eb0a84-f09c-46ee-bf72-d33857612f67/volumes" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.693046 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.965970 5033 generic.go:334] "Generic (PLEG): container finished" podID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerID="a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1" exitCode=0 Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.966271 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerDied","Data":"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1"} Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.970756 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.970880 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.978830 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:19:02 crc kubenswrapper[5033]: I0319 19:19:02.984215 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.068144 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.069118 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.415838 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.811270 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.871914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data\") pod \"451a5336-669f-41f5-aeb9-0c9db1ba5557\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.872313 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blb6s\" (UniqueName: \"kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s\") pod \"451a5336-669f-41f5-aeb9-0c9db1ba5557\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.872432 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts\") pod \"451a5336-669f-41f5-aeb9-0c9db1ba5557\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.872585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle\") pod \"451a5336-669f-41f5-aeb9-0c9db1ba5557\" (UID: \"451a5336-669f-41f5-aeb9-0c9db1ba5557\") " Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.878219 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts" (OuterVolumeSpecName: "scripts") pod "451a5336-669f-41f5-aeb9-0c9db1ba5557" (UID: "451a5336-669f-41f5-aeb9-0c9db1ba5557"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.881575 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s" (OuterVolumeSpecName: "kube-api-access-blb6s") pod "451a5336-669f-41f5-aeb9-0c9db1ba5557" (UID: "451a5336-669f-41f5-aeb9-0c9db1ba5557"). InnerVolumeSpecName "kube-api-access-blb6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.912115 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "451a5336-669f-41f5-aeb9-0c9db1ba5557" (UID: "451a5336-669f-41f5-aeb9-0c9db1ba5557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.920156 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data" (OuterVolumeSpecName: "config-data") pod "451a5336-669f-41f5-aeb9-0c9db1ba5557" (UID: "451a5336-669f-41f5-aeb9-0c9db1ba5557"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.974861 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blb6s\" (UniqueName: \"kubernetes.io/projected/451a5336-669f-41f5-aeb9-0c9db1ba5557-kube-api-access-blb6s\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.975176 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.975186 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.975194 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a5336-669f-41f5-aeb9-0c9db1ba5557-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.986851 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerStarted","Data":"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf"} Mar 19 19:19:03 crc kubenswrapper[5033]: I0319 19:19:03.994003 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerStarted","Data":"140537ce2952b4ce24ba201fad0a9d1d2893f9bd66907b0973fc0e2b839fba8f"} Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.000560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jrt6w" 
event={"ID":"451a5336-669f-41f5-aeb9-0c9db1ba5557","Type":"ContainerDied","Data":"e5e595383e1e75ea4efae8a39866965a0131ce0fc4b26d07c445ee62e6c1bf99"} Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.000592 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e595383e1e75ea4efae8a39866965a0131ce0fc4b26d07c445ee62e6c1bf99" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.000754 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jrt6w" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.015273 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-45qtt" podStartSLOduration=3.377865465 podStartE2EDuration="11.015253445s" podCreationTimestamp="2026-03-19 19:18:53 +0000 UTC" firstStartedPulling="2026-03-19 19:18:55.790175426 +0000 UTC m=+1345.895205275" lastFinishedPulling="2026-03-19 19:19:03.427563406 +0000 UTC m=+1353.532593255" observedRunningTime="2026-03-19 19:19:04.009477632 +0000 UTC m=+1354.114507481" watchObservedRunningTime="2026-03-19 19:19:04.015253445 +0000 UTC m=+1354.120283294" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.109652 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.133215 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.147964 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.148150 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerName="nova-scheduler-scheduler" containerID="cri-o://091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" gracePeriod=30 Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.148511 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.149881 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.150665 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:04 crc kubenswrapper[5033]: I0319 19:19:04.224661 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:04 crc kubenswrapper[5033]: E0319 19:19:04.619391 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:19:04 crc kubenswrapper[5033]: E0319 19:19:04.620947 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:19:04 crc kubenswrapper[5033]: E0319 19:19:04.622281 5033 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:19:04 crc kubenswrapper[5033]: E0319 19:19:04.622330 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerName="nova-scheduler-scheduler" Mar 19 19:19:05 crc kubenswrapper[5033]: I0319 19:19:05.011160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerStarted","Data":"ba82dbfb43cd1baaca044efc8a06f2a61a9d7d0a351a2620d3c307fa8a7fac05"} Mar 19 19:19:05 crc kubenswrapper[5033]: I0319 19:19:05.011634 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-log" containerID="cri-o://af89577753a9d5be74c6933d2f9f5080bc862d2a2a8c04abe3d3014e91a76813" gracePeriod=30 Mar 19 19:19:05 crc kubenswrapper[5033]: I0319 19:19:05.011726 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-api" containerID="cri-o://07f4a379355d646ed834ec3d608fc5edbed479ffa3624cd83506ed4319512a62" gracePeriod=30 Mar 19 19:19:05 crc kubenswrapper[5033]: I0319 19:19:05.035724 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:19:05 crc kubenswrapper[5033]: I0319 19:19:05.195581 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-45qtt" 
podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="registry-server" probeResult="failure" output=< Mar 19 19:19:05 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:19:05 crc kubenswrapper[5033]: > Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.022071 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerID="af89577753a9d5be74c6933d2f9f5080bc862d2a2a8c04abe3d3014e91a76813" exitCode=143 Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.022163 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerDied","Data":"af89577753a9d5be74c6933d2f9f5080bc862d2a2a8c04abe3d3014e91a76813"} Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.025511 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerStarted","Data":"f970c5201223e3d3c7312219a6f43ba8fb3cd7a4eb0ca8305b5137af9110a3b8"} Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.025563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerStarted","Data":"4d8eae01895b8e6db05e4fa3efbc08069d1c3f562fe703adc76a2d8d9a3ac2d1"} Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.025742 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-log" containerID="cri-o://0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232" gracePeriod=30 Mar 19 19:19:06 crc kubenswrapper[5033]: I0319 19:19:06.025855 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c948df4a-0867-410d-924a-e87fc08fc052" 
containerName="nova-metadata-metadata" containerID="cri-o://7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc" gracePeriod=30 Mar 19 19:19:07 crc kubenswrapper[5033]: I0319 19:19:07.034818 5033 generic.go:334] "Generic (PLEG): container finished" podID="c948df4a-0867-410d-924a-e87fc08fc052" containerID="0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232" exitCode=143 Mar 19 19:19:07 crc kubenswrapper[5033]: I0319 19:19:07.034857 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerDied","Data":"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232"} Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.055919 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerStarted","Data":"4def1d7de4d3a2f0dd1e3cd39747c2018841819f2b6d4a53be64773c9cb646a9"} Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.056512 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.058056 5033 generic.go:334] "Generic (PLEG): container finished" podID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerID="091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" exitCode=0 Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.058096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa","Type":"ContainerDied","Data":"091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0"} Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.075345 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.337154276 podStartE2EDuration="7.075331061s" podCreationTimestamp="2026-03-19 19:19:02 +0000 UTC" 
firstStartedPulling="2026-03-19 19:19:03.426144936 +0000 UTC m=+1353.531174785" lastFinishedPulling="2026-03-19 19:19:08.164321721 +0000 UTC m=+1358.269351570" observedRunningTime="2026-03-19 19:19:09.073985903 +0000 UTC m=+1359.179015762" watchObservedRunningTime="2026-03-19 19:19:09.075331061 +0000 UTC m=+1359.180360910" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.508922 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.616353 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle\") pod \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.616571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data\") pod \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.616724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p92j8\" (UniqueName: \"kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8\") pod \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\" (UID: \"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa\") " Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.631170 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8" (OuterVolumeSpecName: "kube-api-access-p92j8") pod "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" (UID: "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa"). InnerVolumeSpecName "kube-api-access-p92j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.632581 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p92j8\" (UniqueName: \"kubernetes.io/projected/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-kube-api-access-p92j8\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.699016 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data" (OuterVolumeSpecName: "config-data") pod "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" (UID: "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.701068 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" (UID: "c912b748-9cd3-42fd-9d8a-58dca0e5d2fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.734986 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.735020 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:09 crc kubenswrapper[5033]: I0319 19:19:09.918324 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.039625 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwkcp\" (UniqueName: \"kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp\") pod \"c948df4a-0867-410d-924a-e87fc08fc052\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.039691 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs\") pod \"c948df4a-0867-410d-924a-e87fc08fc052\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.039818 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data\") pod \"c948df4a-0867-410d-924a-e87fc08fc052\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.039952 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle\") pod \"c948df4a-0867-410d-924a-e87fc08fc052\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.040063 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs\") pod \"c948df4a-0867-410d-924a-e87fc08fc052\" (UID: \"c948df4a-0867-410d-924a-e87fc08fc052\") " Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.040485 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs" (OuterVolumeSpecName: "logs") pod "c948df4a-0867-410d-924a-e87fc08fc052" (UID: "c948df4a-0867-410d-924a-e87fc08fc052"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.040597 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948df4a-0867-410d-924a-e87fc08fc052-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.043840 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp" (OuterVolumeSpecName: "kube-api-access-kwkcp") pod "c948df4a-0867-410d-924a-e87fc08fc052" (UID: "c948df4a-0867-410d-924a-e87fc08fc052"). InnerVolumeSpecName "kube-api-access-kwkcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.087039 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data" (OuterVolumeSpecName: "config-data") pod "c948df4a-0867-410d-924a-e87fc08fc052" (UID: "c948df4a-0867-410d-924a-e87fc08fc052"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.087460 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.087573 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c912b748-9cd3-42fd-9d8a-58dca0e5d2fa","Type":"ContainerDied","Data":"17974f4acc88cd96b0629d9799c4d07dde757d5f422e133a4bdd75ad5479a83d"} Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.087619 5033 scope.go:117] "RemoveContainer" containerID="091769071dbf75a25c4c0d996ca08a0b4b128d398351ed53f0ef5b0e3d7e74f0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.093661 5033 generic.go:334] "Generic (PLEG): container finished" podID="c948df4a-0867-410d-924a-e87fc08fc052" containerID="7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc" exitCode=0 Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.094763 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.095212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerDied","Data":"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc"} Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.095236 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c948df4a-0867-410d-924a-e87fc08fc052","Type":"ContainerDied","Data":"cd0a6d7b30d34b06217e107cb346b2bb4386741654e7ca6a349f9a1f2d4e92d7"} Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.102563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c948df4a-0867-410d-924a-e87fc08fc052" (UID: "c948df4a-0867-410d-924a-e87fc08fc052"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.118750 5033 scope.go:117] "RemoveContainer" containerID="7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.137766 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c948df4a-0867-410d-924a-e87fc08fc052" (UID: "c948df4a-0867-410d-924a-e87fc08fc052"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.142692 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwkcp\" (UniqueName: \"kubernetes.io/projected/c948df4a-0867-410d-924a-e87fc08fc052-kube-api-access-kwkcp\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.142718 5033 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.142728 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.142741 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948df4a-0867-410d-924a-e87fc08fc052-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.156728 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 
19:19:10.161744 5033 scope.go:117] "RemoveContainer" containerID="0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.175993 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.190836 5033 scope.go:117] "RemoveContainer" containerID="7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc" Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.191176 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc\": container with ID starting with 7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc not found: ID does not exist" containerID="7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.191206 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc"} err="failed to get container status \"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc\": rpc error: code = NotFound desc = could not find container \"7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc\": container with ID starting with 7c5f9c66b23e561e56e67b0f867bd4f5269627e2bbb2065c82674877cf852adc not found: ID does not exist" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.191227 5033 scope.go:117] "RemoveContainer" containerID="0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232" Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.192070 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232\": container with ID starting with 
0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232 not found: ID does not exist" containerID="0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.192110 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232"} err="failed to get container status \"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232\": rpc error: code = NotFound desc = could not find container \"0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232\": container with ID starting with 0fdbb3fa726d3b5aca7064f3deeccbb03dec04531d171dcfd3104447e1bc7232 not found: ID does not exist" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.193389 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.193783 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-metadata" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.193799 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-metadata" Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.193829 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerName="nova-scheduler-scheduler" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.193836 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerName="nova-scheduler-scheduler" Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.193844 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451a5336-669f-41f5-aeb9-0c9db1ba5557" containerName="nova-manage" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 
19:19:10.193853 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="451a5336-669f-41f5-aeb9-0c9db1ba5557" containerName="nova-manage" Mar 19 19:19:10 crc kubenswrapper[5033]: E0319 19:19:10.193872 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-log" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.193880 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-log" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.194070 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-metadata" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.194100 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="451a5336-669f-41f5-aeb9-0c9db1ba5557" containerName="nova-manage" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.194110 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" containerName="nova-scheduler-scheduler" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.194122 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c948df4a-0867-410d-924a-e87fc08fc052" containerName="nova-metadata-log" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.194875 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.198778 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.219481 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.346968 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-config-data\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.347277 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrck\" (UniqueName: \"kubernetes.io/projected/be847d21-69a1-4299-a333-94d99e6af513-kube-api-access-dfrck\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.347493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.429619 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.439625 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.449694 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dfrck\" (UniqueName: \"kubernetes.io/projected/be847d21-69a1-4299-a333-94d99e6af513-kube-api-access-dfrck\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.449958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.450152 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-config-data\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.452206 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.453924 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.454278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.454524 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be847d21-69a1-4299-a333-94d99e6af513-config-data\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.462104 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.462440 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.467083 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.493156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrck\" (UniqueName: \"kubernetes.io/projected/be847d21-69a1-4299-a333-94d99e6af513-kube-api-access-dfrck\") pod \"nova-scheduler-0\" (UID: \"be847d21-69a1-4299-a333-94d99e6af513\") " pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.519949 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.555511 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjn2\" (UniqueName: \"kubernetes.io/projected/24d06bd8-50d5-4da0-9041-41f401c1c4fd-kube-api-access-rkjn2\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.555830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-config-data\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.556045 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d06bd8-50d5-4da0-9041-41f401c1c4fd-logs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.556149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.556212 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc 
kubenswrapper[5033]: I0319 19:19:10.654429 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c912b748-9cd3-42fd-9d8a-58dca0e5d2fa" path="/var/lib/kubelet/pods/c912b748-9cd3-42fd-9d8a-58dca0e5d2fa/volumes" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.655786 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c948df4a-0867-410d-924a-e87fc08fc052" path="/var/lib/kubelet/pods/c948df4a-0867-410d-924a-e87fc08fc052/volumes" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.658068 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d06bd8-50d5-4da0-9041-41f401c1c4fd-logs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.658140 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.658163 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.658185 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjn2\" (UniqueName: \"kubernetes.io/projected/24d06bd8-50d5-4da0-9041-41f401c1c4fd-kube-api-access-rkjn2\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 
19:19:10.658216 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-config-data\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.659742 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24d06bd8-50d5-4da0-9041-41f401c1c4fd-logs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.666433 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.668589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-config-data\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.675170 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d06bd8-50d5-4da0-9041-41f401c1c4fd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.675177 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjn2\" (UniqueName: \"kubernetes.io/projected/24d06bd8-50d5-4da0-9041-41f401c1c4fd-kube-api-access-rkjn2\") pod 
\"nova-metadata-0\" (UID: \"24d06bd8-50d5-4da0-9041-41f401c1c4fd\") " pod="openstack/nova-metadata-0" Mar 19 19:19:10 crc kubenswrapper[5033]: I0319 19:19:10.795219 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.068147 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.068195 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.104371 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerID="07f4a379355d646ed834ec3d608fc5edbed479ffa3624cd83506ed4319512a62" exitCode=0 Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.104479 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerDied","Data":"07f4a379355d646ed834ec3d608fc5edbed479ffa3624cd83506ed4319512a62"} Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.350416 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.702530 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.811758 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.812097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.812178 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.812240 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.812315 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.812338 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz2zb\" (UniqueName: 
\"kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb\") pod \"3b579f97-ad23-4171-b4b2-b8ee782d5787\" (UID: \"3b579f97-ad23-4171-b4b2-b8ee782d5787\") " Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.813067 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs" (OuterVolumeSpecName: "logs") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.818032 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb" (OuterVolumeSpecName: "kube-api-access-wz2zb") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). InnerVolumeSpecName "kube-api-access-wz2zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.840739 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.850667 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data" (OuterVolumeSpecName: "config-data") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.857845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.884954 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.893995 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b579f97-ad23-4171-b4b2-b8ee782d5787" (UID: "3b579f97-ad23-4171-b4b2-b8ee782d5787"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914734 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914778 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz2zb\" (UniqueName: \"kubernetes.io/projected/3b579f97-ad23-4171-b4b2-b8ee782d5787-kube-api-access-wz2zb\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914797 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914810 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914823 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b579f97-ad23-4171-b4b2-b8ee782d5787-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:11 crc kubenswrapper[5033]: I0319 19:19:11.914834 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b579f97-ad23-4171-b4b2-b8ee782d5787-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.123410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be847d21-69a1-4299-a333-94d99e6af513","Type":"ContainerStarted","Data":"02d7016c419719893194b14b682b1071a253210ea402a0a04106060fa35f3fa4"} Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.123467 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be847d21-69a1-4299-a333-94d99e6af513","Type":"ContainerStarted","Data":"4f61083453af14ebc60e7b7bc8460ada7ffdd078033edbbc9408d8886a69bf37"} Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.132342 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b579f97-ad23-4171-b4b2-b8ee782d5787","Type":"ContainerDied","Data":"b183d070633523f339b88bf4a35db5254b6679513d57ddb64cfc460699e3a663"} Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.132400 5033 scope.go:117] "RemoveContainer" containerID="07f4a379355d646ed834ec3d608fc5edbed479ffa3624cd83506ed4319512a62" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.132586 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.154664 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"24d06bd8-50d5-4da0-9041-41f401c1c4fd","Type":"ContainerStarted","Data":"2c0748c1854246a3cf6125a560eba2c827fc1471169b4d2f4f6260564eded65f"} Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.154772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"24d06bd8-50d5-4da0-9041-41f401c1c4fd","Type":"ContainerStarted","Data":"07e38924b4234ae61f4f90cd1584a5bdffbaa1d503f0bb0015736417488f5d96"} Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.156905 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.156882057 podStartE2EDuration="2.156882057s" podCreationTimestamp="2026-03-19 19:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:12.145104723 +0000 UTC m=+1362.250134572" watchObservedRunningTime="2026-03-19 19:19:12.156882057 +0000 UTC m=+1362.261911906" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.179655 5033 scope.go:117] "RemoveContainer" containerID="af89577753a9d5be74c6933d2f9f5080bc862d2a2a8c04abe3d3014e91a76813" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.184609 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.206696 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.224853 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:12 crc kubenswrapper[5033]: E0319 19:19:12.225572 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-api" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.225588 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-api" Mar 19 19:19:12 crc kubenswrapper[5033]: E0319 19:19:12.225631 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-log" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.225638 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-log" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.225828 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-log" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.225848 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" containerName="nova-api-api" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.226946 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.229671 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.229871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.229915 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.238321 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.323719 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-config-data\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.323786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdwd\" (UniqueName: \"kubernetes.io/projected/49ced7d3-72c5-4255-a657-9f14d7f2b656-kube-api-access-lkdwd\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.323972 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-public-tls-certs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.324037 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/49ced7d3-72c5-4255-a657-9f14d7f2b656-logs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.324101 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.324424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-config-data\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdwd\" (UniqueName: \"kubernetes.io/projected/49ced7d3-72c5-4255-a657-9f14d7f2b656-kube-api-access-lkdwd\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426760 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-public-tls-certs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" 
Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426793 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ced7d3-72c5-4255-a657-9f14d7f2b656-logs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426825 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.426886 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.427639 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ced7d3-72c5-4255-a657-9f14d7f2b656-logs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.431777 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-public-tls-certs\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.432806 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.433376 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.442733 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ced7d3-72c5-4255-a657-9f14d7f2b656-config-data\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.446858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdwd\" (UniqueName: \"kubernetes.io/projected/49ced7d3-72c5-4255-a657-9f14d7f2b656-kube-api-access-lkdwd\") pod \"nova-api-0\" (UID: \"49ced7d3-72c5-4255-a657-9f14d7f2b656\") " pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.543164 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:19:12 crc kubenswrapper[5033]: I0319 19:19:12.639566 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b579f97-ad23-4171-b4b2-b8ee782d5787" path="/var/lib/kubelet/pods/3b579f97-ad23-4171-b4b2-b8ee782d5787/volumes" Mar 19 19:19:13 crc kubenswrapper[5033]: I0319 19:19:13.005907 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:19:13 crc kubenswrapper[5033]: W0319 19:19:13.008361 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ced7d3_72c5_4255_a657_9f14d7f2b656.slice/crio-af704ddafe03e0fddadff9a49216c97395a21620476215856207fee0efe587a0 WatchSource:0}: Error finding container af704ddafe03e0fddadff9a49216c97395a21620476215856207fee0efe587a0: Status 404 returned error can't find the container with id af704ddafe03e0fddadff9a49216c97395a21620476215856207fee0efe587a0 Mar 19 19:19:13 crc kubenswrapper[5033]: I0319 19:19:13.167698 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49ced7d3-72c5-4255-a657-9f14d7f2b656","Type":"ContainerStarted","Data":"af704ddafe03e0fddadff9a49216c97395a21620476215856207fee0efe587a0"} Mar 19 19:19:13 crc kubenswrapper[5033]: I0319 19:19:13.179297 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"24d06bd8-50d5-4da0-9041-41f401c1c4fd","Type":"ContainerStarted","Data":"fdee63ffa2df1a3d8c5f0418c3b21ea92e10fb63ffad2f03d27ab568701541cb"} Mar 19 19:19:13 crc kubenswrapper[5033]: I0319 19:19:13.211603 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.211580523 podStartE2EDuration="3.211580523s" podCreationTimestamp="2026-03-19 19:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:19:13.200556521 +0000 UTC m=+1363.305586370" watchObservedRunningTime="2026-03-19 19:19:13.211580523 +0000 UTC m=+1363.316610392" Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.193154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49ced7d3-72c5-4255-a657-9f14d7f2b656","Type":"ContainerStarted","Data":"e18993a0f3d848a1cf9bc2513dd6c03dddee809406905b477b581b65afe63aa0"} Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.193558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49ced7d3-72c5-4255-a657-9f14d7f2b656","Type":"ContainerStarted","Data":"e39a6568ca69fae6fb3b9708208886dc40a89bab6eba758374681aea084d5ef1"} Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.198558 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.216248 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.21622419 podStartE2EDuration="2.21622419s" podCreationTimestamp="2026-03-19 19:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:14.211254529 +0000 UTC m=+1364.316284388" watchObservedRunningTime="2026-03-19 19:19:14.21622419 +0000 UTC m=+1364.321254039" Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.258372 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:14 crc kubenswrapper[5033]: I0319 19:19:14.457579 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:19:15 crc kubenswrapper[5033]: I0319 19:19:15.520584 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Mar 19 19:19:16 crc kubenswrapper[5033]: I0319 19:19:16.210747 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-45qtt" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="registry-server" containerID="cri-o://cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf" gracePeriod=2 Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.067910 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.220957 5033 generic.go:334] "Generic (PLEG): container finished" podID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerID="cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf" exitCode=0 Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.221017 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerDied","Data":"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf"} Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.221048 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45qtt" event={"ID":"52d9390d-a917-4b2a-ba2f-a724c95db757","Type":"ContainerDied","Data":"630116de335554a280330ba636f41218c6775c4898985b0bcf730e3254a8f11b"} Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.221066 5033 scope.go:117] "RemoveContainer" containerID="cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.221343 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45qtt" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.231633 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g57g7\" (UniqueName: \"kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7\") pod \"52d9390d-a917-4b2a-ba2f-a724c95db757\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.231909 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities\") pod \"52d9390d-a917-4b2a-ba2f-a724c95db757\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.232013 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content\") pod \"52d9390d-a917-4b2a-ba2f-a724c95db757\" (UID: \"52d9390d-a917-4b2a-ba2f-a724c95db757\") " Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.241442 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities" (OuterVolumeSpecName: "utilities") pod "52d9390d-a917-4b2a-ba2f-a724c95db757" (UID: "52d9390d-a917-4b2a-ba2f-a724c95db757"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.242381 5033 scope.go:117] "RemoveContainer" containerID="a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.248646 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7" (OuterVolumeSpecName: "kube-api-access-g57g7") pod "52d9390d-a917-4b2a-ba2f-a724c95db757" (UID: "52d9390d-a917-4b2a-ba2f-a724c95db757"). InnerVolumeSpecName "kube-api-access-g57g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.324621 5033 scope.go:117] "RemoveContainer" containerID="e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.335799 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.335829 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g57g7\" (UniqueName: \"kubernetes.io/projected/52d9390d-a917-4b2a-ba2f-a724c95db757-kube-api-access-g57g7\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.364906 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52d9390d-a917-4b2a-ba2f-a724c95db757" (UID: "52d9390d-a917-4b2a-ba2f-a724c95db757"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.380945 5033 scope.go:117] "RemoveContainer" containerID="cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf" Mar 19 19:19:17 crc kubenswrapper[5033]: E0319 19:19:17.381413 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf\": container with ID starting with cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf not found: ID does not exist" containerID="cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.381509 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf"} err="failed to get container status \"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf\": rpc error: code = NotFound desc = could not find container \"cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf\": container with ID starting with cf66f5e4a2c3f3565258d99ef8e254b79f410ee68b7e0bdbfdb4447512788fbf not found: ID does not exist" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.381545 5033 scope.go:117] "RemoveContainer" containerID="a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1" Mar 19 19:19:17 crc kubenswrapper[5033]: E0319 19:19:17.381916 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1\": container with ID starting with a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1 not found: ID does not exist" containerID="a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.381949 
5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1"} err="failed to get container status \"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1\": rpc error: code = NotFound desc = could not find container \"a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1\": container with ID starting with a084869df3aba2a4a5c1412b5def0b986a81e8737a10dc4ae7c60fb48ab1aff1 not found: ID does not exist" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.381967 5033 scope.go:117] "RemoveContainer" containerID="e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464" Mar 19 19:19:17 crc kubenswrapper[5033]: E0319 19:19:17.382254 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464\": container with ID starting with e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464 not found: ID does not exist" containerID="e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.382294 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464"} err="failed to get container status \"e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464\": rpc error: code = NotFound desc = could not find container \"e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464\": container with ID starting with e10d584a356d157db484ef14b9f1b733baddbde4d24c530bc19ac35a0ec19464 not found: ID does not exist" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.437306 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/52d9390d-a917-4b2a-ba2f-a724c95db757-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.554707 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:19:17 crc kubenswrapper[5033]: I0319 19:19:17.566656 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-45qtt"] Mar 19 19:19:18 crc kubenswrapper[5033]: I0319 19:19:18.635057 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" path="/var/lib/kubelet/pods/52d9390d-a917-4b2a-ba2f-a724c95db757/volumes" Mar 19 19:19:20 crc kubenswrapper[5033]: I0319 19:19:20.520463 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:19:20 crc kubenswrapper[5033]: I0319 19:19:20.568384 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:19:20 crc kubenswrapper[5033]: I0319 19:19:20.795622 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:19:20 crc kubenswrapper[5033]: I0319 19:19:20.795668 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:19:21 crc kubenswrapper[5033]: I0319 19:19:21.331172 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:19:21 crc kubenswrapper[5033]: I0319 19:19:21.809724 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="24d06bd8-50d5-4da0-9041-41f401c1c4fd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.240:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:21 crc kubenswrapper[5033]: I0319 19:19:21.809680 5033 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="24d06bd8-50d5-4da0-9041-41f401c1c4fd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.240:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:22 crc kubenswrapper[5033]: I0319 19:19:22.543780 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:19:22 crc kubenswrapper[5033]: I0319 19:19:22.544098 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:19:23 crc kubenswrapper[5033]: I0319 19:19:23.557593 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49ced7d3-72c5-4255-a657-9f14d7f2b656" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.241:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:23 crc kubenswrapper[5033]: I0319 19:19:23.557632 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49ced7d3-72c5-4255-a657-9f14d7f2b656" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.241:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:28 crc kubenswrapper[5033]: I0319 19:19:28.796235 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:19:28 crc kubenswrapper[5033]: I0319 19:19:28.797835 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:19:30 crc kubenswrapper[5033]: I0319 19:19:30.543717 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:19:30 crc kubenswrapper[5033]: I0319 19:19:30.544057 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 19 19:19:30 crc kubenswrapper[5033]: I0319 19:19:30.802121 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:19:30 crc kubenswrapper[5033]: I0319 19:19:30.805861 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:19:30 crc kubenswrapper[5033]: I0319 19:19:30.808001 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:19:31 crc kubenswrapper[5033]: I0319 19:19:31.393929 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:19:32 crc kubenswrapper[5033]: I0319 19:19:32.552819 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:19:32 crc kubenswrapper[5033]: I0319 19:19:32.556924 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:19:32 crc kubenswrapper[5033]: I0319 19:19:32.561242 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:19:32 crc kubenswrapper[5033]: I0319 19:19:32.712921 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 19:19:33 crc kubenswrapper[5033]: I0319 19:19:33.408284 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.021534 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-jjb4h"] Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.033837 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-jjb4h"] Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.128583 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-db-sync-9lpbv"] Mar 19 19:19:44 crc kubenswrapper[5033]: E0319 19:19:44.128961 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="registry-server" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.128980 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="registry-server" Mar 19 19:19:44 crc kubenswrapper[5033]: E0319 19:19:44.129012 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="extract-content" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.129021 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="extract-content" Mar 19 19:19:44 crc kubenswrapper[5033]: E0319 19:19:44.129034 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="extract-utilities" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.129041 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="extract-utilities" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.129224 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d9390d-a917-4b2a-ba2f-a724c95db757" containerName="registry-server" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.129886 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.135168 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.152497 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9lpbv"] Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.210063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.210108 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vs2\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.210167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.210238 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc 
kubenswrapper[5033]: I0319 19:19:44.210283 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.312145 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.312190 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vs2\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.312252 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.312334 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.312378 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.321397 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.321824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.322085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.322682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs\") pod \"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.335491 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vs2\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2\") pod 
\"cloudkitty-db-sync-9lpbv\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.456939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:19:44 crc kubenswrapper[5033]: I0319 19:19:44.633042 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412949e0-343a-45ad-873c-ff40cecb82de" path="/var/lib/kubelet/pods/412949e0-343a-45ad-873c-ff40cecb82de/volumes" Mar 19 19:19:45 crc kubenswrapper[5033]: I0319 19:19:45.006706 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9lpbv"] Mar 19 19:19:45 crc kubenswrapper[5033]: I0319 19:19:45.556094 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9lpbv" event={"ID":"9fb5bcaa-3619-4584-b125-1f3d521ffb2c","Type":"ContainerStarted","Data":"4cb0e36e658673ca4598a9e762325d22f725b9015e2de1942b521f37558a4da3"} Mar 19 19:19:45 crc kubenswrapper[5033]: I0319 19:19:45.723478 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.385816 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.386082 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-central-agent" containerID="cri-o://ba82dbfb43cd1baaca044efc8a06f2a61a9d7d0a351a2620d3c307fa8a7fac05" gracePeriod=30 Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.386154 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="sg-core" 
containerID="cri-o://f970c5201223e3d3c7312219a6f43ba8fb3cd7a4eb0ca8305b5137af9110a3b8" gracePeriod=30 Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.386234 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-notification-agent" containerID="cri-o://4d8eae01895b8e6db05e4fa3efbc08069d1c3f562fe703adc76a2d8d9a3ac2d1" gracePeriod=30 Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.386207 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="proxy-httpd" containerID="cri-o://4def1d7de4d3a2f0dd1e3cd39747c2018841819f2b6d4a53be64773c9cb646a9" gracePeriod=30 Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.578440 5033 generic.go:334] "Generic (PLEG): container finished" podID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerID="f970c5201223e3d3c7312219a6f43ba8fb3cd7a4eb0ca8305b5137af9110a3b8" exitCode=2 Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.578494 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerDied","Data":"f970c5201223e3d3c7312219a6f43ba8fb3cd7a4eb0ca8305b5137af9110a3b8"} Mar 19 19:19:46 crc kubenswrapper[5033]: I0319 19:19:46.816426 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:19:47 crc kubenswrapper[5033]: I0319 19:19:47.596570 5033 generic.go:334] "Generic (PLEG): container finished" podID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerID="4def1d7de4d3a2f0dd1e3cd39747c2018841819f2b6d4a53be64773c9cb646a9" exitCode=0 Mar 19 19:19:47 crc kubenswrapper[5033]: I0319 19:19:47.596829 5033 generic.go:334] "Generic (PLEG): container finished" podID="70055567-d2a2-424c-9f2a-a407d9253f5e" 
containerID="ba82dbfb43cd1baaca044efc8a06f2a61a9d7d0a351a2620d3c307fa8a7fac05" exitCode=0 Mar 19 19:19:47 crc kubenswrapper[5033]: I0319 19:19:47.596652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerDied","Data":"4def1d7de4d3a2f0dd1e3cd39747c2018841819f2b6d4a53be64773c9cb646a9"} Mar 19 19:19:47 crc kubenswrapper[5033]: I0319 19:19:47.596869 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerDied","Data":"ba82dbfb43cd1baaca044efc8a06f2a61a9d7d0a351a2620d3c307fa8a7fac05"} Mar 19 19:19:49 crc kubenswrapper[5033]: I0319 19:19:49.643088 5033 generic.go:334] "Generic (PLEG): container finished" podID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerID="4d8eae01895b8e6db05e4fa3efbc08069d1c3f562fe703adc76a2d8d9a3ac2d1" exitCode=0 Mar 19 19:19:49 crc kubenswrapper[5033]: I0319 19:19:49.643428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerDied","Data":"4d8eae01895b8e6db05e4fa3efbc08069d1c3f562fe703adc76a2d8d9a3ac2d1"} Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.228605 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341737 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341824 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341900 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341953 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.341995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccrzq\" (UniqueName: \"kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.342086 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs\") pod \"70055567-d2a2-424c-9f2a-a407d9253f5e\" (UID: \"70055567-d2a2-424c-9f2a-a407d9253f5e\") " Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.342272 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.342358 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.342990 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.343012 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70055567-d2a2-424c-9f2a-a407d9253f5e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.391372 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts" (OuterVolumeSpecName: "scripts") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.409663 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq" (OuterVolumeSpecName: "kube-api-access-ccrzq") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "kube-api-access-ccrzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.445950 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.445981 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccrzq\" (UniqueName: \"kubernetes.io/projected/70055567-d2a2-424c-9f2a-a407d9253f5e-kube-api-access-ccrzq\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.446133 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.493433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.539675 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="rabbitmq" containerID="cri-o://1a36b31f845e1c0781461b231449786077a087754d63d07145620aa63ac84c39" gracePeriod=604796 Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.547480 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.547691 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.556882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.597875 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data" (OuterVolumeSpecName: "config-data") pod "70055567-d2a2-424c-9f2a-a407d9253f5e" (UID: "70055567-d2a2-424c-9f2a-a407d9253f5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.650854 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.650890 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70055567-d2a2-424c-9f2a-a407d9253f5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.662792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70055567-d2a2-424c-9f2a-a407d9253f5e","Type":"ContainerDied","Data":"140537ce2952b4ce24ba201fad0a9d1d2893f9bd66907b0973fc0e2b839fba8f"} Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.662840 5033 scope.go:117] "RemoveContainer" containerID="4def1d7de4d3a2f0dd1e3cd39747c2018841819f2b6d4a53be64773c9cb646a9" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.662865 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.688771 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.706745 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.711788 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:50 crc kubenswrapper[5033]: E0319 19:19:50.712182 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="sg-core" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712201 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="sg-core" Mar 19 19:19:50 crc kubenswrapper[5033]: E0319 19:19:50.712233 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-notification-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712240 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-notification-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: E0319 19:19:50.712249 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-central-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712255 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-central-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: E0319 19:19:50.712267 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="proxy-httpd" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712272 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="proxy-httpd" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712476 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-notification-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712496 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="ceilometer-central-agent" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712508 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="sg-core" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.712522 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" containerName="proxy-httpd" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.714996 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.718730 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.718823 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.718963 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.727095 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.750852 5033 scope.go:117] "RemoveContainer" containerID="f970c5201223e3d3c7312219a6f43ba8fb3cd7a4eb0ca8305b5137af9110a3b8" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.817248 5033 scope.go:117] "RemoveContainer" containerID="4d8eae01895b8e6db05e4fa3efbc08069d1c3f562fe703adc76a2d8d9a3ac2d1" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.841941 5033 scope.go:117] "RemoveContainer" containerID="ba82dbfb43cd1baaca044efc8a06f2a61a9d7d0a351a2620d3c307fa8a7fac05" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.855968 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-run-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 
19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856054 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-config-data\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856100 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-scripts\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856414 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856553 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9hj\" (UniqueName: \"kubernetes.io/projected/2084df77-794b-44e4-92a5-16ccb442b1ee-kube-api-access-dl9hj\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.856694 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-log-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959045 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-run-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959118 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-config-data\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959145 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-scripts\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959214 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959255 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9hj\" (UniqueName: \"kubernetes.io/projected/2084df77-794b-44e4-92a5-16ccb442b1ee-kube-api-access-dl9hj\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959308 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-log-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.959735 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-log-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.961271 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2084df77-794b-44e4-92a5-16ccb442b1ee-run-httpd\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.963851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.964198 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-config-data\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.964253 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.965042 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-scripts\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.967101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2084df77-794b-44e4-92a5-16ccb442b1ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:50 crc kubenswrapper[5033]: I0319 19:19:50.975386 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9hj\" (UniqueName: \"kubernetes.io/projected/2084df77-794b-44e4-92a5-16ccb442b1ee-kube-api-access-dl9hj\") pod \"ceilometer-0\" (UID: \"2084df77-794b-44e4-92a5-16ccb442b1ee\") " pod="openstack/ceilometer-0" Mar 19 19:19:51 crc kubenswrapper[5033]: I0319 
19:19:51.056073 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:51 crc kubenswrapper[5033]: I0319 19:19:51.569654 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:51 crc kubenswrapper[5033]: W0319 19:19:51.579797 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2084df77_794b_44e4_92a5_16ccb442b1ee.slice/crio-a2b38948a5bc71382f774be334067763d98e4d01a008cf5d9c5eabb00b49d7d3 WatchSource:0}: Error finding container a2b38948a5bc71382f774be334067763d98e4d01a008cf5d9c5eabb00b49d7d3: Status 404 returned error can't find the container with id a2b38948a5bc71382f774be334067763d98e4d01a008cf5d9c5eabb00b49d7d3 Mar 19 19:19:51 crc kubenswrapper[5033]: I0319 19:19:51.678554 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2084df77-794b-44e4-92a5-16ccb442b1ee","Type":"ContainerStarted","Data":"a2b38948a5bc71382f774be334067763d98e4d01a008cf5d9c5eabb00b49d7d3"} Mar 19 19:19:51 crc kubenswrapper[5033]: I0319 19:19:51.691323 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="rabbitmq" containerID="cri-o://41fb3582d1232a277241178fe8b3c077cce7be9055ded5f3e80f154971416e95" gracePeriod=604796 Mar 19 19:19:52 crc kubenswrapper[5033]: I0319 19:19:52.638069 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70055567-d2a2-424c-9f2a-a407d9253f5e" path="/var/lib/kubelet/pods/70055567-d2a2-424c-9f2a-a407d9253f5e/volumes" Mar 19 19:19:56 crc kubenswrapper[5033]: I0319 19:19:56.470158 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.0.109:5671: connect: connection refused" Mar 19 19:19:56 crc kubenswrapper[5033]: I0319 19:19:56.501082 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Mar 19 19:19:57 crc kubenswrapper[5033]: I0319 19:19:57.778498 5033 generic.go:334] "Generic (PLEG): container finished" podID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerID="1a36b31f845e1c0781461b231449786077a087754d63d07145620aa63ac84c39" exitCode=0 Mar 19 19:19:57 crc kubenswrapper[5033]: I0319 19:19:57.778609 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerDied","Data":"1a36b31f845e1c0781461b231449786077a087754d63d07145620aa63ac84c39"} Mar 19 19:19:58 crc kubenswrapper[5033]: I0319 19:19:58.838549 5033 generic.go:334] "Generic (PLEG): container finished" podID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerID="41fb3582d1232a277241178fe8b3c077cce7be9055ded5f3e80f154971416e95" exitCode=0 Mar 19 19:19:58 crc kubenswrapper[5033]: I0319 19:19:58.838597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerDied","Data":"41fb3582d1232a277241178fe8b3c077cce7be9055ded5f3e80f154971416e95"} Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.129662 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565800-ctlbb"] Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.131765 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.133961 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.134206 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.134343 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.141873 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-ctlbb"] Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.266786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6cgg\" (UniqueName: \"kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg\") pod \"auto-csr-approver-29565800-ctlbb\" (UID: \"f1c6e8dd-3762-416c-af0e-99322b5561c3\") " pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.371356 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6cgg\" (UniqueName: \"kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg\") pod \"auto-csr-approver-29565800-ctlbb\" (UID: \"f1c6e8dd-3762-416c-af0e-99322b5561c3\") " pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.409106 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6cgg\" (UniqueName: \"kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg\") pod \"auto-csr-approver-29565800-ctlbb\" (UID: \"f1c6e8dd-3762-416c-af0e-99322b5561c3\") " 
pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.409171 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.411142 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.419835 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.424863 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.450898 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473363 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff2jr\" (UniqueName: \"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473436 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473504 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473536 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473622 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.473672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.577798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.577905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.577974 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff2jr\" (UniqueName: \"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.578042 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.578101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.578138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.578201 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.578764 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.579178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.579308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.579837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" 
(UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.579862 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.580579 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.596278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff2jr\" (UniqueName: \"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr\") pod \"dnsmasq-dns-dbb88bf8c-6jx7j\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:00 crc kubenswrapper[5033]: I0319 19:20:00.775019 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.204536 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295094 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295570 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295615 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7ks\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295675 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295709 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295754 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295797 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295832 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.295938 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.296005 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.296410 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.296660 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.298377 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.298911 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\" (UID: \"ee035802-be9d-40dc-9f6c-3cb58bcb13d6\") " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.299878 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.299905 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.299919 5033 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.306117 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.310113 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.316831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks" (OuterVolumeSpecName: "kube-api-access-sd7ks") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "kube-api-access-sd7ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.348604 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info" (OuterVolumeSpecName: "pod-info") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.349868 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9" (OuterVolumeSpecName: "persistence") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.393818 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data" (OuterVolumeSpecName: "config-data") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401890 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd7ks\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-kube-api-access-sd7ks\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401918 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401928 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401941 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401949 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.401971 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") on node \"crc\" " Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.508558 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf" (OuterVolumeSpecName: "server-conf") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.576205 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.609797 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.625794 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9") on node "crc" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.709877 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ee035802-be9d-40dc-9f6c-3cb58bcb13d6" (UID: "ee035802-be9d-40dc-9f6c-3cb58bcb13d6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.711215 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee035802-be9d-40dc-9f6c-3cb58bcb13d6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.711230 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.872152 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ee035802-be9d-40dc-9f6c-3cb58bcb13d6","Type":"ContainerDied","Data":"7f5335cab0e0df384bd865e9a0f45ff1e546c5b2da410a11e80b6bc521c9b3e7"} Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.872201 5033 scope.go:117] "RemoveContainer" 
containerID="1a36b31f845e1c0781461b231449786077a087754d63d07145620aa63ac84c39" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.872274 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.914551 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.944785 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.975924 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:20:01 crc kubenswrapper[5033]: E0319 19:20:01.976736 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="setup-container" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.976756 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="setup-container" Mar 19 19:20:01 crc kubenswrapper[5033]: E0319 19:20:01.976775 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="rabbitmq" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.976783 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="rabbitmq" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.977079 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" containerName="rabbitmq" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.978839 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993548 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993827 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-77h7k" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993612 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993702 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993696 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993739 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.993740 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 19:20:01 crc kubenswrapper[5033]: I0319 19:20:01.998405 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.016839 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.016871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-config-data\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.016887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.016904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c343ed29-14b7-4363-a055-7b540ee2ea31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017083 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017188 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017234 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017255 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6vg\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-kube-api-access-kb6vg\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017297 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017336 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c343ed29-14b7-4363-a055-7b540ee2ea31-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.017391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120019 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120116 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6vg\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-kube-api-access-kb6vg\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c343ed29-14b7-4363-a055-7b540ee2ea31-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120229 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120352 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120389 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-config-data\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c343ed29-14b7-4363-a055-7b540ee2ea31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.120551 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.121265 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.121391 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.121889 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-config-data\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.123408 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c343ed29-14b7-4363-a055-7b540ee2ea31-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.125743 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c343ed29-14b7-4363-a055-7b540ee2ea31-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.126249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.126259 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.132907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c343ed29-14b7-4363-a055-7b540ee2ea31-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.138777 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.138829 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb05dd1e0a4f02545007a3008c7d9e3f987cb58a1dd67affb0d78d89041635cc/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.147086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6vg\" (UniqueName: \"kubernetes.io/projected/c343ed29-14b7-4363-a055-7b540ee2ea31-kube-api-access-kb6vg\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.218163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed61f12-935a-49a9-85c3-f13f9500e8f9\") pod \"rabbitmq-server-0\" (UID: \"c343ed29-14b7-4363-a055-7b540ee2ea31\") " pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.312026 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:20:02 crc kubenswrapper[5033]: I0319 19:20:02.642138 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee035802-be9d-40dc-9f6c-3cb58bcb13d6" path="/var/lib/kubelet/pods/ee035802-be9d-40dc-9f6c-3cb58bcb13d6/volumes" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.687072 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.792421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.792934 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vggq\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.792993 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.793077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.793117 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.794940 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.795607 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.795800 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.795928 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.795983 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.796004 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.796043 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.796097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls\") pod \"3fd2f356-f4a3-4256-905e-581b33d3a974\" (UID: \"3fd2f356-f4a3-4256-905e-581b33d3a974\") " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.797110 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.797129 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.804586 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq" (OuterVolumeSpecName: "kube-api-access-7vggq") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "kube-api-access-7vggq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.808003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.809284 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.810274 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info" (OuterVolumeSpecName: "pod-info") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.840085 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data" (OuterVolumeSpecName: "config-data") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.850977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.851393 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832" (OuterVolumeSpecName: "persistence") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "pvc-aa174056-2e5a-4234-85bc-6be4779f9832". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900161 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") on node \"crc\" " Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900199 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fd2f356-f4a3-4256-905e-581b33d3a974-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900209 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900218 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3fd2f356-f4a3-4256-905e-581b33d3a974-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900228 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900236 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vggq\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-kube-api-access-7vggq\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.900244 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.914157 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf" (OuterVolumeSpecName: "server-conf") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.968436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fd2f356-f4a3-4256-905e-581b33d3a974","Type":"ContainerDied","Data":"f514a423ef066900ac6fdcd134a22ed9d17b28ab9c8f9b8183cb3b96ffdea77c"} Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.968622 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.977418 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.977594 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-aa174056-2e5a-4234-85bc-6be4779f9832" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832") on node "crc" Mar 19 19:20:05 crc kubenswrapper[5033]: I0319 19:20:05.983213 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3fd2f356-f4a3-4256-905e-581b33d3a974" (UID: "3fd2f356-f4a3-4256-905e-581b33d3a974"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.004715 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fd2f356-f4a3-4256-905e-581b33d3a974-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.004749 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.004760 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fd2f356-f4a3-4256-905e-581b33d3a974-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.308027 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.319572 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.340927 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:20:06 crc kubenswrapper[5033]: E0319 19:20:06.341348 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="rabbitmq" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.341364 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="rabbitmq" Mar 19 19:20:06 crc kubenswrapper[5033]: E0319 19:20:06.341390 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="setup-container" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.341396 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="setup-container" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.341606 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" containerName="rabbitmq" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.342658 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.347878 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.347999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.348048 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.348096 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.348264 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.348407 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xgb79" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.350269 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.356519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.514780 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.514865 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8dad24e-7338-41b9-b008-f3dd1c68d3de-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.514912 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.514948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515006 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515128 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515183 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnwh\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-kube-api-access-xbnwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515334 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515420 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.515566 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8dad24e-7338-41b9-b008-f3dd1c68d3de-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.617644 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.617735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8dad24e-7338-41b9-b008-f3dd1c68d3de-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.617788 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.617831 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.617910 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.618146 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.619233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.622820 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.623139 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c2787f9d79718a101190626a8fac4e044cb69ec532c273cd4d0472dce50cb36/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.623177 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.623538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.624019 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8dad24e-7338-41b9-b008-f3dd1c68d3de-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628349 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628437 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628534 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnwh\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-kube-api-access-xbnwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628662 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.628913 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8dad24e-7338-41b9-b008-f3dd1c68d3de-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.629898 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8dad24e-7338-41b9-b008-f3dd1c68d3de-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.630213 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.630369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8dad24e-7338-41b9-b008-f3dd1c68d3de-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: 
I0319 19:20:06.635607 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8dad24e-7338-41b9-b008-f3dd1c68d3de-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.641702 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd2f356-f4a3-4256-905e-581b33d3a974" path="/var/lib/kubelet/pods/3fd2f356-f4a3-4256-905e-581b33d3a974/volumes" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.650776 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnwh\" (UniqueName: \"kubernetes.io/projected/f8dad24e-7338-41b9-b008-f3dd1c68d3de-kube-api-access-xbnwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.698008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aa174056-2e5a-4234-85bc-6be4779f9832\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aa174056-2e5a-4234-85bc-6be4779f9832\") pod \"rabbitmq-cell1-server-0\" (UID: \"f8dad24e-7338-41b9-b008-f3dd1c68d3de\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:06 crc kubenswrapper[5033]: I0319 19:20:06.963757 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:10 crc kubenswrapper[5033]: I0319 19:20:10.957066 5033 scope.go:117] "RemoveContainer" containerID="4a087ee37539a6611aa3f7f2252e9f01369edf2d536cb5388780ab9ac0c88afc" Mar 19 19:20:11 crc kubenswrapper[5033]: E0319 19:20:11.596000 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 19:20:11 crc kubenswrapper[5033]: E0319 19:20:11.596062 5033 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 19:20:11 crc kubenswrapper[5033]: E0319 19:20:11.596177 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c9h544hd7h5f9h599h589h645h586h677h5bdh65ch55bh5dhdbh5bbh554h5fch66fh685hcch85h589hddh5h59fh56dh5bbhf9h5dch87h569h677q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dl9hj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(2084df77-794b-44e4-92a5-16ccb442b1ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:20:12 crc kubenswrapper[5033]: E0319 19:20:12.246861 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 19 19:20:12 crc kubenswrapper[5033]: E0319 19:20:12.247146 5033 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 19 19:20:12 crc kubenswrapper[5033]: E0319 19:20:12.247259 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7vs2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-9lpbv_openstack(9fb5bcaa-3619-4584-b125-1f3d521ffb2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:20:12 crc kubenswrapper[5033]: E0319 19:20:12.248581 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-9lpbv" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" Mar 19 19:20:12 crc kubenswrapper[5033]: I0319 19:20:12.294679 5033 scope.go:117] "RemoveContainer" containerID="41fb3582d1232a277241178fe8b3c077cce7be9055ded5f3e80f154971416e95" Mar 19 19:20:12 crc kubenswrapper[5033]: I0319 19:20:12.539530 5033 scope.go:117] "RemoveContainer" containerID="a9c60b46b5a9fb57d068e917fa1f0d393fc4d26de4b8ab813a914a8467d5a93d" Mar 19 19:20:13 crc kubenswrapper[5033]: I0319 19:20:13.022882 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:13 crc kubenswrapper[5033]: I0319 19:20:13.044433 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-ctlbb"] Mar 19 19:20:13 crc kubenswrapper[5033]: I0319 19:20:13.064914 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" event={"ID":"2ac9c98f-3855-4940-bb9e-0b4b87831509","Type":"ContainerStarted","Data":"98cac2981a1793e7333f1667bc3832704cf5d19f3d6ab41a30c61f242388a42a"} Mar 19 19:20:13 crc kubenswrapper[5033]: E0319 19:20:13.065582 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-9lpbv" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" Mar 19 19:20:13 crc kubenswrapper[5033]: I0319 19:20:13.202964 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:20:13 crc kubenswrapper[5033]: W0319 19:20:13.210077 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc343ed29_14b7_4363_a055_7b540ee2ea31.slice/crio-4213cc61d26c190daa8cc88d8561349bce894e3963c6a4cd7bd84905a868b0a9 WatchSource:0}: Error finding container 4213cc61d26c190daa8cc88d8561349bce894e3963c6a4cd7bd84905a868b0a9: Status 404 returned error can't find the container with id 4213cc61d26c190daa8cc88d8561349bce894e3963c6a4cd7bd84905a868b0a9 Mar 19 19:20:13 crc kubenswrapper[5033]: W0319 19:20:13.422860 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dad24e_7338_41b9_b008_f3dd1c68d3de.slice/crio-d48eeecb063ecce81645bba062a0e1076bbcb00e8010018227fc14d7810fbdc6 WatchSource:0}: Error finding container d48eeecb063ecce81645bba062a0e1076bbcb00e8010018227fc14d7810fbdc6: Status 404 returned error can't find the container with id d48eeecb063ecce81645bba062a0e1076bbcb00e8010018227fc14d7810fbdc6 Mar 19 19:20:13 crc kubenswrapper[5033]: I0319 19:20:13.429098 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.075887 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2084df77-794b-44e4-92a5-16ccb442b1ee","Type":"ContainerStarted","Data":"2ab29c42274984daa0dddd7d94f8623dad34db4ecb5f7328a4cc371270d5d468"} Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.075939 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"2084df77-794b-44e4-92a5-16ccb442b1ee","Type":"ContainerStarted","Data":"6f79ff62447c7bccc44c28f675b48c6c647972d1399c8754d319be630fb2490f"} Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.077274 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c343ed29-14b7-4363-a055-7b540ee2ea31","Type":"ContainerStarted","Data":"4213cc61d26c190daa8cc88d8561349bce894e3963c6a4cd7bd84905a868b0a9"} Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.078422 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f8dad24e-7338-41b9-b008-f3dd1c68d3de","Type":"ContainerStarted","Data":"d48eeecb063ecce81645bba062a0e1076bbcb00e8010018227fc14d7810fbdc6"} Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.079789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" event={"ID":"f1c6e8dd-3762-416c-af0e-99322b5561c3","Type":"ContainerStarted","Data":"14264f7d7e5dcb328ed4fd4a66afec123dc4321ff14a17830d69eb44aa250f0d"} Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.081949 5033 generic.go:334] "Generic (PLEG): container finished" podID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerID="b7f240c3082309541ac89bcc748ccc600b24b3c0ed37c1518efcf356e147d1c2" exitCode=0 Mar 19 19:20:14 crc kubenswrapper[5033]: I0319 19:20:14.081987 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" event={"ID":"2ac9c98f-3855-4940-bb9e-0b4b87831509","Type":"ContainerDied","Data":"b7f240c3082309541ac89bcc748ccc600b24b3c0ed37c1518efcf356e147d1c2"} Mar 19 19:20:15 crc kubenswrapper[5033]: I0319 19:20:15.093299 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" 
event={"ID":"2ac9c98f-3855-4940-bb9e-0b4b87831509","Type":"ContainerStarted","Data":"61cbb4ef32c113889ca7854e117d11062b5c4515f1d39a35cce1ed943e43166a"} Mar 19 19:20:15 crc kubenswrapper[5033]: I0319 19:20:15.094158 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:15 crc kubenswrapper[5033]: I0319 19:20:15.120817 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" podStartSLOduration=15.120797671 podStartE2EDuration="15.120797671s" podCreationTimestamp="2026-03-19 19:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:15.111772575 +0000 UTC m=+1425.216802424" watchObservedRunningTime="2026-03-19 19:20:15.120797671 +0000 UTC m=+1425.225827520" Mar 19 19:20:17 crc kubenswrapper[5033]: I0319 19:20:17.112058 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c343ed29-14b7-4363-a055-7b540ee2ea31","Type":"ContainerStarted","Data":"5776cf11bae670c14ac3464c38757525422fe5838015efeff8e7662a926eae1e"} Mar 19 19:20:17 crc kubenswrapper[5033]: I0319 19:20:17.114756 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f8dad24e-7338-41b9-b008-f3dd1c68d3de","Type":"ContainerStarted","Data":"500b581d4dd466e44fbab3963b3d152bbad6502ead543b1450b90f9d6594bc49"} Mar 19 19:20:17 crc kubenswrapper[5033]: I0319 19:20:17.118848 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" event={"ID":"f1c6e8dd-3762-416c-af0e-99322b5561c3","Type":"ContainerStarted","Data":"55d3e136266e06a108fd7dc086a680131e07976c1a126cae7b34a40d5deb59eb"} Mar 19 19:20:17 crc kubenswrapper[5033]: I0319 19:20:17.191309 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29565800-ctlbb" podStartSLOduration=13.992866524 podStartE2EDuration="17.191286887s" podCreationTimestamp="2026-03-19 19:20:00 +0000 UTC" firstStartedPulling="2026-03-19 19:20:13.053679717 +0000 UTC m=+1423.158709556" lastFinishedPulling="2026-03-19 19:20:16.25210007 +0000 UTC m=+1426.357129919" observedRunningTime="2026-03-19 19:20:17.17000042 +0000 UTC m=+1427.275030269" watchObservedRunningTime="2026-03-19 19:20:17.191286887 +0000 UTC m=+1427.296316736" Mar 19 19:20:18 crc kubenswrapper[5033]: I0319 19:20:18.128469 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1c6e8dd-3762-416c-af0e-99322b5561c3" containerID="55d3e136266e06a108fd7dc086a680131e07976c1a126cae7b34a40d5deb59eb" exitCode=0 Mar 19 19:20:18 crc kubenswrapper[5033]: I0319 19:20:18.128544 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" event={"ID":"f1c6e8dd-3762-416c-af0e-99322b5561c3","Type":"ContainerDied","Data":"55d3e136266e06a108fd7dc086a680131e07976c1a126cae7b34a40d5deb59eb"} Mar 19 19:20:19 crc kubenswrapper[5033]: E0319 19:20:19.397325 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="2084df77-794b-44e4-92a5-16ccb442b1ee" Mar 19 19:20:19 crc kubenswrapper[5033]: I0319 19:20:19.869960 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:19 crc kubenswrapper[5033]: I0319 19:20:19.946500 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6cgg\" (UniqueName: \"kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg\") pod \"f1c6e8dd-3762-416c-af0e-99322b5561c3\" (UID: \"f1c6e8dd-3762-416c-af0e-99322b5561c3\") " Mar 19 19:20:19 crc kubenswrapper[5033]: I0319 19:20:19.952779 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg" (OuterVolumeSpecName: "kube-api-access-v6cgg") pod "f1c6e8dd-3762-416c-af0e-99322b5561c3" (UID: "f1c6e8dd-3762-416c-af0e-99322b5561c3"). InnerVolumeSpecName "kube-api-access-v6cgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.049488 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6cgg\" (UniqueName: \"kubernetes.io/projected/f1c6e8dd-3762-416c-af0e-99322b5561c3-kube-api-access-v6cgg\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.186012 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2084df77-794b-44e4-92a5-16ccb442b1ee","Type":"ContainerStarted","Data":"0f75d1e7ea88072f6c07e608571384253a2a712429b671f905b181073fcd2f69"} Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.186505 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:20:20 crc kubenswrapper[5033]: E0319 19:20:20.189557 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" 
pod="openstack/ceilometer-0" podUID="2084df77-794b-44e4-92a5-16ccb442b1ee" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.189781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" event={"ID":"f1c6e8dd-3762-416c-af0e-99322b5561c3","Type":"ContainerDied","Data":"14264f7d7e5dcb328ed4fd4a66afec123dc4321ff14a17830d69eb44aa250f0d"} Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.189818 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14264f7d7e5dcb328ed4fd4a66afec123dc4321ff14a17830d69eb44aa250f0d" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.189877 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-ctlbb" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.776788 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.862680 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.862897 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="dnsmasq-dns" containerID="cri-o://62c2fde7e847f78acffad4cfadc715fe289bcf13871bf818624f3d3a5558f81f" gracePeriod=10 Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.944532 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-qn7zn"] Mar 19 19:20:20 crc kubenswrapper[5033]: I0319 19:20:20.954827 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-qn7zn"] Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.049471 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-85f64749dc-gvrd7"] Mar 19 19:20:21 crc kubenswrapper[5033]: E0319 19:20:21.049953 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c6e8dd-3762-416c-af0e-99322b5561c3" containerName="oc" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.049967 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c6e8dd-3762-416c-af0e-99322b5561c3" containerName="oc" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.050194 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c6e8dd-3762-416c-af0e-99322b5561c3" containerName="oc" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.051279 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.062414 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-gvrd7"] Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172172 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tcq\" (UniqueName: \"kubernetes.io/projected/cdc791a7-7319-491f-8a1a-bdcd8c333890-kube-api-access-c5tcq\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172243 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-config\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172361 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172385 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-svc\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172402 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172478 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.172504 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.202506 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="c1caae62-86e1-4c11-8499-52cc408eb399" containerID="62c2fde7e847f78acffad4cfadc715fe289bcf13871bf818624f3d3a5558f81f" exitCode=0 Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.203563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" event={"ID":"c1caae62-86e1-4c11-8499-52cc408eb399","Type":"ContainerDied","Data":"62c2fde7e847f78acffad4cfadc715fe289bcf13871bf818624f3d3a5558f81f"} Mar 19 19:20:21 crc kubenswrapper[5033]: E0319 19:20:21.204830 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="2084df77-794b-44e4-92a5-16ccb442b1ee" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.274802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5tcq\" (UniqueName: \"kubernetes.io/projected/cdc791a7-7319-491f-8a1a-bdcd8c333890-kube-api-access-c5tcq\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.274884 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-config\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.275009 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " 
pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.275031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-svc\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.275061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.275095 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.275130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.277011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-config\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc 
kubenswrapper[5033]: I0319 19:20:21.279261 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-svc\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.279370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.279691 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.279858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.280010 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdc791a7-7319-491f-8a1a-bdcd8c333890-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.295042 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c5tcq\" (UniqueName: \"kubernetes.io/projected/cdc791a7-7319-491f-8a1a-bdcd8c333890-kube-api-access-c5tcq\") pod \"dnsmasq-dns-85f64749dc-gvrd7\" (UID: \"cdc791a7-7319-491f-8a1a-bdcd8c333890\") " pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.370891 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:21 crc kubenswrapper[5033]: I0319 19:20:21.987731 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.065500 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-gvrd7"] Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.094149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7xtn\" (UniqueName: \"kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.094245 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.096607 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.096670 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.096742 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.096791 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc\") pod \"c1caae62-86e1-4c11-8499-52cc408eb399\" (UID: \"c1caae62-86e1-4c11-8499-52cc408eb399\") " Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.109735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn" (OuterVolumeSpecName: "kube-api-access-d7xtn") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "kube-api-access-d7xtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.202610 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.213386 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7xtn\" (UniqueName: \"kubernetes.io/projected/c1caae62-86e1-4c11-8499-52cc408eb399-kube-api-access-d7xtn\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.213428 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.230474 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config" (OuterVolumeSpecName: "config") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.230755 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.230893 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.230899 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1caae62-86e1-4c11-8499-52cc408eb399" (UID: "c1caae62-86e1-4c11-8499-52cc408eb399"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.244154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" event={"ID":"c1caae62-86e1-4c11-8499-52cc408eb399","Type":"ContainerDied","Data":"252954f0eefb9e55dc0b4bb0d7bca5b5b90023315911b0643c98c86899e54645"} Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.244220 5033 scope.go:117] "RemoveContainer" containerID="62c2fde7e847f78acffad4cfadc715fe289bcf13871bf818624f3d3a5558f81f" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.244358 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-lcdt2" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.251774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" event={"ID":"cdc791a7-7319-491f-8a1a-bdcd8c333890","Type":"ContainerStarted","Data":"dd289a6e6adcfb2c73b994157d56c19e991d740b79e1ae64e28d65a8c1ef042f"} Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.299020 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.312022 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-lcdt2"] Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.314748 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.314771 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.314780 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.314788 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1caae62-86e1-4c11-8499-52cc408eb399-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.319522 5033 scope.go:117] "RemoveContainer" containerID="33dc49f190eca0bb83f35e5c3d3afaf29cf6ec8cc4e645137c029679bb30e71f" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 
19:20:22.632335 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb876edd-c30e-4253-ac09-9db2e08dc2fc" path="/var/lib/kubelet/pods/bb876edd-c30e-4253-ac09-9db2e08dc2fc/volumes" Mar 19 19:20:22 crc kubenswrapper[5033]: I0319 19:20:22.633292 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" path="/var/lib/kubelet/pods/c1caae62-86e1-4c11-8499-52cc408eb399/volumes" Mar 19 19:20:23 crc kubenswrapper[5033]: I0319 19:20:23.264654 5033 generic.go:334] "Generic (PLEG): container finished" podID="cdc791a7-7319-491f-8a1a-bdcd8c333890" containerID="d0cdbd41d93b1d75f2bd8922509dea6aae376d58d7cfe466409d3d3dc8eb3277" exitCode=0 Mar 19 19:20:23 crc kubenswrapper[5033]: I0319 19:20:23.264695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" event={"ID":"cdc791a7-7319-491f-8a1a-bdcd8c333890","Type":"ContainerDied","Data":"d0cdbd41d93b1d75f2bd8922509dea6aae376d58d7cfe466409d3d3dc8eb3277"} Mar 19 19:20:24 crc kubenswrapper[5033]: I0319 19:20:24.276597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" event={"ID":"cdc791a7-7319-491f-8a1a-bdcd8c333890","Type":"ContainerStarted","Data":"ec89e4b1a7c7df073eab5eaa27d4beac299706f085ee4a1133c6c6f1f7561433"} Mar 19 19:20:24 crc kubenswrapper[5033]: I0319 19:20:24.277179 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:24 crc kubenswrapper[5033]: I0319 19:20:24.306009 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" podStartSLOduration=3.3059926060000002 podStartE2EDuration="3.305992606s" podCreationTimestamp="2026-03-19 19:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:24.295240959 +0000 UTC 
m=+1434.400270808" watchObservedRunningTime="2026-03-19 19:20:24.305992606 +0000 UTC m=+1434.411022455" Mar 19 19:20:28 crc kubenswrapper[5033]: I0319 19:20:28.328178 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9lpbv" event={"ID":"9fb5bcaa-3619-4584-b125-1f3d521ffb2c","Type":"ContainerStarted","Data":"e0551935d2c9102ad1b47d80cf4386c012b279edd80bea174fcb2991f8ae8c9b"} Mar 19 19:20:28 crc kubenswrapper[5033]: I0319 19:20:28.347101 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-9lpbv" podStartSLOduration=1.56496347 podStartE2EDuration="44.347065434s" podCreationTimestamp="2026-03-19 19:19:44 +0000 UTC" firstStartedPulling="2026-03-19 19:19:45.015606745 +0000 UTC m=+1395.120636594" lastFinishedPulling="2026-03-19 19:20:27.797708709 +0000 UTC m=+1437.902738558" observedRunningTime="2026-03-19 19:20:28.344711377 +0000 UTC m=+1438.449741246" watchObservedRunningTime="2026-03-19 19:20:28.347065434 +0000 UTC m=+1438.452095293" Mar 19 19:20:30 crc kubenswrapper[5033]: I0319 19:20:30.348436 5033 generic.go:334] "Generic (PLEG): container finished" podID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" containerID="e0551935d2c9102ad1b47d80cf4386c012b279edd80bea174fcb2991f8ae8c9b" exitCode=0 Mar 19 19:20:30 crc kubenswrapper[5033]: I0319 19:20:30.348547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9lpbv" event={"ID":"9fb5bcaa-3619-4584-b125-1f3d521ffb2c","Type":"ContainerDied","Data":"e0551935d2c9102ad1b47d80cf4386c012b279edd80bea174fcb2991f8ae8c9b"} Mar 19 19:20:31 crc kubenswrapper[5033]: I0319 19:20:31.373674 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-gvrd7" Mar 19 19:20:31 crc kubenswrapper[5033]: I0319 19:20:31.478095 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:31 crc kubenswrapper[5033]: I0319 
19:20:31.478354 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="dnsmasq-dns" containerID="cri-o://61cbb4ef32c113889ca7854e117d11062b5c4515f1d39a35cce1ed943e43166a" gracePeriod=10 Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.368133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9lpbv" event={"ID":"9fb5bcaa-3619-4584-b125-1f3d521ffb2c","Type":"ContainerDied","Data":"4cb0e36e658673ca4598a9e762325d22f725b9015e2de1942b521f37558a4da3"} Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.368379 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb0e36e658673ca4598a9e762325d22f725b9015e2de1942b521f37558a4da3" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.370078 5033 generic.go:334] "Generic (PLEG): container finished" podID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerID="61cbb4ef32c113889ca7854e117d11062b5c4515f1d39a35cce1ed943e43166a" exitCode=0 Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.370120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" event={"ID":"2ac9c98f-3855-4940-bb9e-0b4b87831509","Type":"ContainerDied","Data":"61cbb4ef32c113889ca7854e117d11062b5c4515f1d39a35cce1ed943e43166a"} Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.501227 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.510625 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.545724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7vs2\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2\") pod \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.545792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data\") pod \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.545926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs\") pod \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.545959 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts\") pod \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.546039 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle\") pod \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\" (UID: \"9fb5bcaa-3619-4584-b125-1f3d521ffb2c\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.556572 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts" (OuterVolumeSpecName: "scripts") pod "9fb5bcaa-3619-4584-b125-1f3d521ffb2c" (UID: "9fb5bcaa-3619-4584-b125-1f3d521ffb2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.575029 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs" (OuterVolumeSpecName: "certs") pod "9fb5bcaa-3619-4584-b125-1f3d521ffb2c" (UID: "9fb5bcaa-3619-4584-b125-1f3d521ffb2c"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.581709 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2" (OuterVolumeSpecName: "kube-api-access-w7vs2") pod "9fb5bcaa-3619-4584-b125-1f3d521ffb2c" (UID: "9fb5bcaa-3619-4584-b125-1f3d521ffb2c"). InnerVolumeSpecName "kube-api-access-w7vs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.586512 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fb5bcaa-3619-4584-b125-1f3d521ffb2c" (UID: "9fb5bcaa-3619-4584-b125-1f3d521ffb2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.622512 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data" (OuterVolumeSpecName: "config-data") pod "9fb5bcaa-3619-4584-b125-1f3d521ffb2c" (UID: "9fb5bcaa-3619-4584-b125-1f3d521ffb2c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.647802 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff2jr\" (UniqueName: \"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.647856 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.647997 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648025 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648065 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648094 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648144 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config\") pod \"2ac9c98f-3855-4940-bb9e-0b4b87831509\" (UID: \"2ac9c98f-3855-4940-bb9e-0b4b87831509\") " Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648750 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7vs2\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-kube-api-access-w7vs2\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648765 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648774 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648785 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.648794 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb5bcaa-3619-4584-b125-1f3d521ffb2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.670842 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr" (OuterVolumeSpecName: "kube-api-access-ff2jr") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "kube-api-access-ff2jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.710339 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.753298 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.753339 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff2jr\" (UniqueName: \"kubernetes.io/projected/2ac9c98f-3855-4940-bb9e-0b4b87831509-kube-api-access-ff2jr\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.760133 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.777994 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.781618 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.795051 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config" (OuterVolumeSpecName: "config") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.821932 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ac9c98f-3855-4940-bb9e-0b4b87831509" (UID: "2ac9c98f-3855-4940-bb9e-0b4b87831509"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.855195 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.855233 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.855243 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.855251 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[5033]: I0319 19:20:32.855260 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac9c98f-3855-4940-bb9e-0b4b87831509-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.381813 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" event={"ID":"2ac9c98f-3855-4940-bb9e-0b4b87831509","Type":"ContainerDied","Data":"98cac2981a1793e7333f1667bc3832704cf5d19f3d6ab41a30c61f242388a42a"} Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.381856 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-6jx7j" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.381886 5033 scope.go:117] "RemoveContainer" containerID="61cbb4ef32c113889ca7854e117d11062b5c4515f1d39a35cce1ed943e43166a" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.381856 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9lpbv" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.414550 5033 scope.go:117] "RemoveContainer" containerID="b7f240c3082309541ac89bcc748ccc600b24b3c0ed37c1518efcf356e147d1c2" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.442505 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.453275 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-6jx7j"] Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.688221 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-dv4ng"] Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.697675 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-dv4ng"] Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.783607 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-c4ql9"] Mar 19 19:20:33 crc kubenswrapper[5033]: E0319 19:20:33.784113 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="init" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784130 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="init" Mar 19 19:20:33 crc kubenswrapper[5033]: E0319 19:20:33.784161 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="init" 
Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784169 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="init" Mar 19 19:20:33 crc kubenswrapper[5033]: E0319 19:20:33.784180 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784186 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: E0319 19:20:33.784196 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" containerName="cloudkitty-db-sync" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" containerName="cloudkitty-db-sync" Mar 19 19:20:33 crc kubenswrapper[5033]: E0319 19:20:33.784217 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784222 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784400 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1caae62-86e1-4c11-8499-52cc408eb399" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784418 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" containerName="dnsmasq-dns" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.784434 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" containerName="cloudkitty-db-sync" Mar 19 19:20:33 crc 
kubenswrapper[5033]: I0319 19:20:33.785156 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.790557 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.799816 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c4ql9"] Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.875625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.875676 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.875700 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.875997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts\") pod \"cloudkitty-storageinit-c4ql9\" (UID: 
\"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.876226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnh27\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.977689 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.977744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.977768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.977823 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " 
pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.977877 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnh27\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.983400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.985407 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.986016 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:33 crc kubenswrapper[5033]: I0319 19:20:33.996870 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.002111 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnh27\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27\") pod \"cloudkitty-storageinit-c4ql9\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.109417 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.587702 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c4ql9"] Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.642254 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e68c82-8a51-4824-a4ae-211e63d05144" path="/var/lib/kubelet/pods/02e68c82-8a51-4824-a4ae-211e63d05144/volumes" Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.643123 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac9c98f-3855-4940-bb9e-0b4b87831509" path="/var/lib/kubelet/pods/2ac9c98f-3855-4940-bb9e-0b4b87831509/volumes" Mar 19 19:20:34 crc kubenswrapper[5033]: I0319 19:20:34.643913 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 19:20:35 crc kubenswrapper[5033]: I0319 19:20:35.404230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c4ql9" event={"ID":"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a","Type":"ContainerStarted","Data":"1a790ba1233f7377994be247c27ae8f28531ed4c328257919d038a2548b48a1b"} Mar 19 19:20:35 crc kubenswrapper[5033]: I0319 19:20:35.404781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c4ql9" event={"ID":"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a","Type":"ContainerStarted","Data":"aac0015624c431eb73f806046cb308c1430106af3fa4c0efe759d0eefdd468d6"} Mar 19 19:20:35 crc 
kubenswrapper[5033]: I0319 19:20:35.409169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2084df77-794b-44e4-92a5-16ccb442b1ee","Type":"ContainerStarted","Data":"5499e2ead1f168c504d502d670280cb2931b94815871ddd365ef15c497180ace"} Mar 19 19:20:35 crc kubenswrapper[5033]: I0319 19:20:35.421724 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-c4ql9" podStartSLOduration=2.42170875 podStartE2EDuration="2.42170875s" podCreationTimestamp="2026-03-19 19:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:35.416353568 +0000 UTC m=+1445.521383417" watchObservedRunningTime="2026-03-19 19:20:35.42170875 +0000 UTC m=+1445.526738590" Mar 19 19:20:35 crc kubenswrapper[5033]: I0319 19:20:35.439976 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.217316221 podStartE2EDuration="45.439953381s" podCreationTimestamp="2026-03-19 19:19:50 +0000 UTC" firstStartedPulling="2026-03-19 19:19:51.588991034 +0000 UTC m=+1401.694020883" lastFinishedPulling="2026-03-19 19:20:34.811628194 +0000 UTC m=+1444.916658043" observedRunningTime="2026-03-19 19:20:35.437089589 +0000 UTC m=+1445.542119448" watchObservedRunningTime="2026-03-19 19:20:35.439953381 +0000 UTC m=+1445.544983230" Mar 19 19:20:37 crc kubenswrapper[5033]: I0319 19:20:37.431573 5033 generic.go:334] "Generic (PLEG): container finished" podID="53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" containerID="1a790ba1233f7377994be247c27ae8f28531ed4c328257919d038a2548b48a1b" exitCode=0 Mar 19 19:20:37 crc kubenswrapper[5033]: I0319 19:20:37.431641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c4ql9" 
event={"ID":"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a","Type":"ContainerDied","Data":"1a790ba1233f7377994be247c27ae8f28531ed4c328257919d038a2548b48a1b"} Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.234340 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.285822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs\") pod \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.285870 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle\") pod \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.285911 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnh27\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27\") pod \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.286046 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data\") pod \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.286149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts\") pod 
\"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\" (UID: \"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a\") " Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.294839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27" (OuterVolumeSpecName: "kube-api-access-dnh27") pod "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" (UID: "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a"). InnerVolumeSpecName "kube-api-access-dnh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.310384 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts" (OuterVolumeSpecName: "scripts") pod "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" (UID: "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.318244 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" (UID: "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.319601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs" (OuterVolumeSpecName: "certs") pod "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" (UID: "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.324093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data" (OuterVolumeSpecName: "config-data") pod "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" (UID: "53f861aa-a778-4f3d-bc7f-e4b74b10ca6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.388881 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.388918 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.388935 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnh27\" (UniqueName: \"kubernetes.io/projected/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-kube-api-access-dnh27\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.388949 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.388960 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.453220 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c4ql9" 
event={"ID":"53f861aa-a778-4f3d-bc7f-e4b74b10ca6a","Type":"ContainerDied","Data":"aac0015624c431eb73f806046cb308c1430106af3fa4c0efe759d0eefdd468d6"} Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.453259 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac0015624c431eb73f806046cb308c1430106af3fa4c0efe759d0eefdd468d6" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.453283 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c4ql9" Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.566419 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.566738 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" containerName="cloudkitty-proc" containerID="cri-o://72f15a7ac2c4129a38ffe07a8d6244a8da70fb44210764b9531ef198c1406225" gracePeriod=30 Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.584086 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.584318 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api-log" containerID="cri-o://0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64" gracePeriod=30 Mar 19 19:20:39 crc kubenswrapper[5033]: I0319 19:20:39.584440 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api" containerID="cri-o://f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b" gracePeriod=30 Mar 19 19:20:40 crc kubenswrapper[5033]: I0319 19:20:40.476840 5033 
generic.go:334] "Generic (PLEG): container finished" podID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" containerID="72f15a7ac2c4129a38ffe07a8d6244a8da70fb44210764b9531ef198c1406225" exitCode=0 Mar 19 19:20:40 crc kubenswrapper[5033]: I0319 19:20:40.477333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"1d668e98-8678-40bf-8c6c-f44f1937c5a0","Type":"ContainerDied","Data":"72f15a7ac2c4129a38ffe07a8d6244a8da70fb44210764b9531ef198c1406225"} Mar 19 19:20:40 crc kubenswrapper[5033]: I0319 19:20:40.482354 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerID="0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64" exitCode=143 Mar 19 19:20:40 crc kubenswrapper[5033]: I0319 19:20:40.482394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerDied","Data":"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64"} Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.234729 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350295 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350328 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350396 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350551 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmcrs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350584 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350611 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350652 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350718 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs" (OuterVolumeSpecName: "logs") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.350743 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs\") pod \"b1d4b7ba-05b1-424e-9635-2d2177381d21\" (UID: \"b1d4b7ba-05b1-424e-9635-2d2177381d21\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.351856 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d4b7ba-05b1-424e-9635-2d2177381d21-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.357746 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.362776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs" (OuterVolumeSpecName: "certs") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.367701 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs" (OuterVolumeSpecName: "kube-api-access-gmcrs") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "kube-api-access-gmcrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.372811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts" (OuterVolumeSpecName: "scripts") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.399753 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data" (OuterVolumeSpecName: "config-data") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.430093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453331 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453359 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453371 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmcrs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-kube-api-access-gmcrs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453379 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453388 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453396 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b1d4b7ba-05b1-424e-9635-2d2177381d21-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.453395 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.458845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b1d4b7ba-05b1-424e-9635-2d2177381d21" (UID: "b1d4b7ba-05b1-424e-9635-2d2177381d21"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.501994 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerID="f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b" exitCode=0 Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.502074 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.502084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerDied","Data":"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b"} Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.502155 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b1d4b7ba-05b1-424e-9635-2d2177381d21","Type":"ContainerDied","Data":"9f954ce971d568130875ae50f9ec4965ce90353c0c93ff35b293c94c195d24c0"} Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.502176 5033 scope.go:117] "RemoveContainer" containerID="f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.507070 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" 
event={"ID":"1d668e98-8678-40bf-8c6c-f44f1937c5a0","Type":"ContainerDied","Data":"1e202adfc8c92f6f20e9e9e635f0b2b68438541109bf1a5fde9cf80635288087"} Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.507101 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e202adfc8c92f6f20e9e9e635f0b2b68438541109bf1a5fde9cf80635288087" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.551120 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.561046 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.561086 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d4b7ba-05b1-424e-9635-2d2177381d21-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.580120 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.580665 5033 scope.go:117] "RemoveContainer" containerID="0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.613278 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.616804 5033 scope.go:117] "RemoveContainer" containerID="f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b" Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.618611 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b\": container with ID starting with f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b not found: ID does not exist" containerID="f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.618661 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b"} err="failed to get container status \"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b\": rpc error: code = NotFound desc = could not find container \"f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b\": container with ID starting with f7d01f3bd88e2d7baea2233b736cee64cf3711612d8a6f2b20f882058b733b8b not found: ID does not exist" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.618691 5033 scope.go:117] "RemoveContainer" containerID="0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64" Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.619351 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64\": container with ID starting with 0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64 not found: ID does not exist" containerID="0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.619375 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64"} err="failed to get container status \"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64\": rpc error: code = NotFound desc = could not find container \"0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64\": container with ID 
starting with 0756d5d7ba68cb95e3024f994ace86800cc50d88a0af8bb08074918e580e6c64 not found: ID does not exist" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.641388 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.643370 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api-log" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643398 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api-log" Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.643425 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" containerName="cloudkitty-storageinit" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643434 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" containerName="cloudkitty-storageinit" Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.643468 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643478 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api" Mar 19 19:20:41 crc kubenswrapper[5033]: E0319 19:20:41.643493 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" containerName="cloudkitty-proc" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643502 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" containerName="cloudkitty-proc" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643807 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" containerName="cloudkitty-proc" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643829 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api-log" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643851 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" containerName="cloudkitty-api" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.643872 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" containerName="cloudkitty-storageinit" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.645776 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.647871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.648035 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.648270 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.654730 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.662656 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.662780 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.662864 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckczf\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.663081 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.663192 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.663242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom\") pod \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\" (UID: \"1d668e98-8678-40bf-8c6c-f44f1937c5a0\") " Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.665468 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs" (OuterVolumeSpecName: "certs") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.666126 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts" (OuterVolumeSpecName: "scripts") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.666956 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.668942 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf" (OuterVolumeSpecName: "kube-api-access-ckczf") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "kube-api-access-ckczf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.692374 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.703601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data" (OuterVolumeSpecName: "config-data") pod "1d668e98-8678-40bf-8c6c-f44f1937c5a0" (UID: "1d668e98-8678-40bf-8c6c-f44f1937c5a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765411 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765547 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdnn\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-kube-api-access-8tdnn\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " 
pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-scripts\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-logs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765688 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765790 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765851 5033 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765863 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765897 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765907 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765915 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d668e98-8678-40bf-8c6c-f44f1937c5a0-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.765925 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckczf\" (UniqueName: \"kubernetes.io/projected/1d668e98-8678-40bf-8c6c-f44f1937c5a0-kube-api-access-ckczf\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.871538 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.871719 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872364 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872537 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872591 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdnn\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-kube-api-access-8tdnn\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-scripts\") pod 
\"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872778 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-logs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.872852 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.873442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-logs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.875359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.875876 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.875917 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-scripts\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.877756 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.878207 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.878417 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-config-data\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.880857 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.893370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdnn\" (UniqueName: \"kubernetes.io/projected/2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0-kube-api-access-8tdnn\") pod \"cloudkitty-api-0\" (UID: \"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0\") " pod="openstack/cloudkitty-api-0" Mar 19 
19:20:41 crc kubenswrapper[5033]: I0319 19:20:41.961421 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.474597 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.522321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0","Type":"ContainerStarted","Data":"b84df8bb33dcccbf9ae5d04bcfd4ed09f446754723b42881a9e7aaa00aa3ab64"} Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.523812 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.599189 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.616221 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.641755 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d668e98-8678-40bf-8c6c-f44f1937c5a0" path="/var/lib/kubelet/pods/1d668e98-8678-40bf-8c6c-f44f1937c5a0/volumes" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.642345 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d4b7ba-05b1-424e-9635-2d2177381d21" path="/var/lib/kubelet/pods/b1d4b7ba-05b1-424e-9635-2d2177381d21/volumes" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.642938 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.644221 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.644319 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.646232 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.810706 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.810761 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpc4d\" (UniqueName: \"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-kube-api-access-cpc4d\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.811179 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.811262 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.811335 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-certs\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.811404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913248 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpc4d\" (UniqueName: \"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-kube-api-access-cpc4d\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " 
pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-certs\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.913804 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.917297 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.917488 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-certs\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.917763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-scripts\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.917944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-config-data\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.918584 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0222c-e676-4f3c-8930-6926d7824866-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.932118 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpc4d\" (UniqueName: \"kubernetes.io/projected/a1b0222c-e676-4f3c-8930-6926d7824866-kube-api-access-cpc4d\") pod \"cloudkitty-proc-0\" (UID: \"a1b0222c-e676-4f3c-8930-6926d7824866\") " pod="openstack/cloudkitty-proc-0" Mar 19 19:20:42 crc kubenswrapper[5033]: I0319 19:20:42.978064 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.429956 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.533997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0","Type":"ContainerStarted","Data":"03ff77a01bd92444613d8df8efff240cd21b0b14263f00612733b62dea92b271"} Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.534038 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0","Type":"ContainerStarted","Data":"deddc8ce5ed0fcb1f05d6e28c956284b69294a223aae7efd8e4a5118248a45ac"} Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.534208 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.535421 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a1b0222c-e676-4f3c-8930-6926d7824866","Type":"ContainerStarted","Data":"1e60b469781318527c36ae38c7eb501ecb33d3d0cd92700c645d8ec20d0cf8cf"} Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.559099 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.55907787 podStartE2EDuration="2.55907787s" podCreationTimestamp="2026-03-19 19:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:43.558708049 +0000 UTC m=+1453.663737898" watchObservedRunningTime="2026-03-19 19:20:43.55907787 +0000 UTC m=+1453.664107719" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.854998 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f"] Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.856553 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.858817 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.859004 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.860413 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.860604 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:20:43 crc kubenswrapper[5033]: I0319 19:20:43.875176 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f"] Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.034499 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.034827 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzn6\" (UniqueName: \"kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.034941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.034966 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.136507 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.137632 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzn6\" (UniqueName: \"kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.138041 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.138108 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.143607 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.143731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.144254 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.154129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzn6\" (UniqueName: \"kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:44 crc kubenswrapper[5033]: I0319 19:20:44.175541 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:20:46 crc kubenswrapper[5033]: I0319 19:20:46.324226 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f"] Mar 19 19:20:47 crc kubenswrapper[5033]: I0319 19:20:47.194937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" event={"ID":"473215a7-171f-46db-ad01-632b59a1eb95","Type":"ContainerStarted","Data":"cb54658cda0964c62a128d2dab0c0c2a20217e7a96ca6ccf6f456041e0420c9e"} Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.208500 5033 generic.go:334] "Generic (PLEG): container finished" podID="c343ed29-14b7-4363-a055-7b540ee2ea31" containerID="5776cf11bae670c14ac3464c38757525422fe5838015efeff8e7662a926eae1e" exitCode=0 Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.208584 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c343ed29-14b7-4363-a055-7b540ee2ea31","Type":"ContainerDied","Data":"5776cf11bae670c14ac3464c38757525422fe5838015efeff8e7662a926eae1e"} Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.212357 5033 generic.go:334] "Generic (PLEG): container finished" podID="f8dad24e-7338-41b9-b008-f3dd1c68d3de" containerID="500b581d4dd466e44fbab3963b3d152bbad6502ead543b1450b90f9d6594bc49" exitCode=0 Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.212463 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f8dad24e-7338-41b9-b008-f3dd1c68d3de","Type":"ContainerDied","Data":"500b581d4dd466e44fbab3963b3d152bbad6502ead543b1450b90f9d6594bc49"} Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.214232 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"a1b0222c-e676-4f3c-8930-6926d7824866","Type":"ContainerStarted","Data":"951714b217151ac2555c71207d4c6b102e7a45f9cc427f3e89fd91d75cfde19b"} Mar 19 19:20:48 crc kubenswrapper[5033]: I0319 19:20:48.248850 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.325634797 podStartE2EDuration="6.248834494s" podCreationTimestamp="2026-03-19 19:20:42 +0000 UTC" firstStartedPulling="2026-03-19 19:20:43.433408536 +0000 UTC m=+1453.538438385" lastFinishedPulling="2026-03-19 19:20:47.356608223 +0000 UTC m=+1457.461638082" observedRunningTime="2026-03-19 19:20:48.246382334 +0000 UTC m=+1458.351412183" watchObservedRunningTime="2026-03-19 19:20:48.248834494 +0000 UTC m=+1458.353864343" Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.234667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c343ed29-14b7-4363-a055-7b540ee2ea31","Type":"ContainerStarted","Data":"181be660eb557ed542d6bf781fb6b9d8c8873db99b87cab761abb15aed3e454f"} Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.235169 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.244496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f8dad24e-7338-41b9-b008-f3dd1c68d3de","Type":"ContainerStarted","Data":"52690c357d882ba5764c7bd534fa9ffd6aac58e661cb629fecf15f248661ab20"} Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.244939 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.270004 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.269989851 podStartE2EDuration="48.269989851s" podCreationTimestamp="2026-03-19 19:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:49.261297473 +0000 UTC m=+1459.366327322" watchObservedRunningTime="2026-03-19 19:20:49.269989851 +0000 UTC m=+1459.375019700" Mar 19 19:20:49 crc kubenswrapper[5033]: I0319 19:20:49.292703 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.292686708 podStartE2EDuration="43.292686708s" podCreationTimestamp="2026-03-19 19:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:49.289010513 +0000 UTC m=+1459.394040372" watchObservedRunningTime="2026-03-19 19:20:49.292686708 +0000 UTC m=+1459.397716557" Mar 19 19:20:57 crc kubenswrapper[5033]: I0319 19:20:57.327901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" 
event={"ID":"473215a7-171f-46db-ad01-632b59a1eb95","Type":"ContainerStarted","Data":"5c1ecbe4710e4615a5470daa3feb17d70bf314cb2a4cc377ff6a339a5e249f91"} Mar 19 19:20:57 crc kubenswrapper[5033]: I0319 19:20:57.346708 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" podStartSLOduration=3.680589187 podStartE2EDuration="14.34668861s" podCreationTimestamp="2026-03-19 19:20:43 +0000 UTC" firstStartedPulling="2026-03-19 19:20:46.321295452 +0000 UTC m=+1456.426325301" lastFinishedPulling="2026-03-19 19:20:56.987394875 +0000 UTC m=+1467.092424724" observedRunningTime="2026-03-19 19:20:57.341408949 +0000 UTC m=+1467.446438798" watchObservedRunningTime="2026-03-19 19:20:57.34668861 +0000 UTC m=+1467.451718459" Mar 19 19:21:02 crc kubenswrapper[5033]: I0319 19:21:02.316701 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 19:21:06 crc kubenswrapper[5033]: I0319 19:21:06.968643 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:21:08 crc kubenswrapper[5033]: I0319 19:21:08.445582 5033 generic.go:334] "Generic (PLEG): container finished" podID="473215a7-171f-46db-ad01-632b59a1eb95" containerID="5c1ecbe4710e4615a5470daa3feb17d70bf314cb2a4cc377ff6a339a5e249f91" exitCode=0 Mar 19 19:21:08 crc kubenswrapper[5033]: I0319 19:21:08.445629 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" event={"ID":"473215a7-171f-46db-ad01-632b59a1eb95","Type":"ContainerDied","Data":"5c1ecbe4710e4615a5470daa3feb17d70bf314cb2a4cc377ff6a339a5e249f91"} Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.076188 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.208415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzn6\" (UniqueName: \"kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6\") pod \"473215a7-171f-46db-ad01-632b59a1eb95\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.209747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory\") pod \"473215a7-171f-46db-ad01-632b59a1eb95\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.210284 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam\") pod \"473215a7-171f-46db-ad01-632b59a1eb95\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.210554 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle\") pod \"473215a7-171f-46db-ad01-632b59a1eb95\" (UID: \"473215a7-171f-46db-ad01-632b59a1eb95\") " Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.215476 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "473215a7-171f-46db-ad01-632b59a1eb95" (UID: "473215a7-171f-46db-ad01-632b59a1eb95"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.216661 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6" (OuterVolumeSpecName: "kube-api-access-gzzn6") pod "473215a7-171f-46db-ad01-632b59a1eb95" (UID: "473215a7-171f-46db-ad01-632b59a1eb95"). InnerVolumeSpecName "kube-api-access-gzzn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.242663 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "473215a7-171f-46db-ad01-632b59a1eb95" (UID: "473215a7-171f-46db-ad01-632b59a1eb95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.255887 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory" (OuterVolumeSpecName: "inventory") pod "473215a7-171f-46db-ad01-632b59a1eb95" (UID: "473215a7-171f-46db-ad01-632b59a1eb95"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.313944 5033 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.313977 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzn6\" (UniqueName: \"kubernetes.io/projected/473215a7-171f-46db-ad01-632b59a1eb95-kube-api-access-gzzn6\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.313989 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.313997 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/473215a7-171f-46db-ad01-632b59a1eb95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.466021 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" event={"ID":"473215a7-171f-46db-ad01-632b59a1eb95","Type":"ContainerDied","Data":"cb54658cda0964c62a128d2dab0c0c2a20217e7a96ca6ccf6f456041e0420c9e"} Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.466060 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb54658cda0964c62a128d2dab0c0c2a20217e7a96ca6ccf6f456041e0420c9e" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.466068 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.572119 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79"] Mar 19 19:21:10 crc kubenswrapper[5033]: E0319 19:21:10.572807 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473215a7-171f-46db-ad01-632b59a1eb95" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.572827 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="473215a7-171f-46db-ad01-632b59a1eb95" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.573068 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="473215a7-171f-46db-ad01-632b59a1eb95" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.573779 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.580313 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.580541 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.580651 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.580774 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.601422 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79"] Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.726509 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.726561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb68\" (UniqueName: \"kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.726865 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.759140 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.759205 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.829609 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.829665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flb68\" (UniqueName: \"kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.829820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.833275 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.834323 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.847725 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flb68\" (UniqueName: \"kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x4k79\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:10 crc kubenswrapper[5033]: I0319 19:21:10.904083 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:11 crc kubenswrapper[5033]: I0319 19:21:11.528844 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79"] Mar 19 19:21:11 crc kubenswrapper[5033]: W0319 19:21:11.534998 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf166f58a_6a51_48ba_ae2e_aa90d8a656dc.slice/crio-d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260 WatchSource:0}: Error finding container d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260: Status 404 returned error can't find the container with id d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260 Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.400142 5033 scope.go:117] "RemoveContainer" containerID="eb98b9e36ce3a9f4c043b8fd90f8689773a1f355631d734c1ea657fd8ff89f9d" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.440214 5033 scope.go:117] "RemoveContainer" containerID="675948172a31265004e1481053dad14f758e5626aad4d5037b068fef23e7507d" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.485515 5033 scope.go:117] "RemoveContainer" containerID="6ff92545eafd2e4e600cd62655455ed51b6d22bb6c42368a7e1484fd827133c8" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.486600 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" event={"ID":"f166f58a-6a51-48ba-ae2e-aa90d8a656dc","Type":"ContainerStarted","Data":"207e44c99f4ad89740f6563c27435c8bd9eb18d8c76da6b0ac6af9bd2fb64f2f"} Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.486630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" 
event={"ID":"f166f58a-6a51-48ba-ae2e-aa90d8a656dc","Type":"ContainerStarted","Data":"d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260"} Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.505369 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" podStartSLOduration=2.057664599 podStartE2EDuration="2.505350875s" podCreationTimestamp="2026-03-19 19:21:10 +0000 UTC" firstStartedPulling="2026-03-19 19:21:11.542053807 +0000 UTC m=+1481.647083656" lastFinishedPulling="2026-03-19 19:21:11.989740083 +0000 UTC m=+1482.094769932" observedRunningTime="2026-03-19 19:21:12.50133721 +0000 UTC m=+1482.606367059" watchObservedRunningTime="2026-03-19 19:21:12.505350875 +0000 UTC m=+1482.610380724" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.544286 5033 scope.go:117] "RemoveContainer" containerID="8ccb34d2c903bbca2ba09f3303239e9f3a44fc474f9a19c5f0b8409a8ea5ffb4" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.571982 5033 scope.go:117] "RemoveContainer" containerID="7c07c80951690c8f39ec1c80441ae37f6fbc5de0ce477c5c9090a5455c4dc465" Mar 19 19:21:12 crc kubenswrapper[5033]: I0319 19:21:12.592971 5033 scope.go:117] "RemoveContainer" containerID="804e3258f0d1393788163824af8c99f2f5885b99ce19f68906b36322b81fe8ed" Mar 19 19:21:15 crc kubenswrapper[5033]: I0319 19:21:15.520093 5033 generic.go:334] "Generic (PLEG): container finished" podID="f166f58a-6a51-48ba-ae2e-aa90d8a656dc" containerID="207e44c99f4ad89740f6563c27435c8bd9eb18d8c76da6b0ac6af9bd2fb64f2f" exitCode=0 Mar 19 19:21:15 crc kubenswrapper[5033]: I0319 19:21:15.520352 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" event={"ID":"f166f58a-6a51-48ba-ae2e-aa90d8a656dc","Type":"ContainerDied","Data":"207e44c99f4ad89740f6563c27435c8bd9eb18d8c76da6b0ac6af9bd2fb64f2f"} Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.179027 5033 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.293032 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory\") pod \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.293147 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flb68\" (UniqueName: \"kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68\") pod \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.293186 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam\") pod \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\" (UID: \"f166f58a-6a51-48ba-ae2e-aa90d8a656dc\") " Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.300630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68" (OuterVolumeSpecName: "kube-api-access-flb68") pod "f166f58a-6a51-48ba-ae2e-aa90d8a656dc" (UID: "f166f58a-6a51-48ba-ae2e-aa90d8a656dc"). InnerVolumeSpecName "kube-api-access-flb68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.342579 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f166f58a-6a51-48ba-ae2e-aa90d8a656dc" (UID: "f166f58a-6a51-48ba-ae2e-aa90d8a656dc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.356647 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory" (OuterVolumeSpecName: "inventory") pod "f166f58a-6a51-48ba-ae2e-aa90d8a656dc" (UID: "f166f58a-6a51-48ba-ae2e-aa90d8a656dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.398847 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.398882 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flb68\" (UniqueName: \"kubernetes.io/projected/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-kube-api-access-flb68\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.398895 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f166f58a-6a51-48ba-ae2e-aa90d8a656dc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.539258 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" 
event={"ID":"f166f58a-6a51-48ba-ae2e-aa90d8a656dc","Type":"ContainerDied","Data":"d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260"} Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.539574 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d387d97f12bcf7927e3fd5bb059aaba529585e75237f74ed011d1d0df07aa260" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.539316 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x4k79" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.606911 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf"] Mar 19 19:21:17 crc kubenswrapper[5033]: E0319 19:21:17.607406 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f166f58a-6a51-48ba-ae2e-aa90d8a656dc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.607429 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f166f58a-6a51-48ba-ae2e-aa90d8a656dc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.607643 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f166f58a-6a51-48ba-ae2e-aa90d8a656dc" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.608326 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.611181 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.611242 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.611749 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.611903 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.625242 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf"] Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.805659 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vvrx\" (UniqueName: \"kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.805810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 
19:21:17.806017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.806604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.907947 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.908018 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.908086 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.908141 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vvrx\" (UniqueName: \"kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.916026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.918411 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.919141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:17 crc kubenswrapper[5033]: I0319 19:21:17.927166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vvrx\" (UniqueName: \"kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-snswf\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:18 crc kubenswrapper[5033]: I0319 19:21:18.225726 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:21:18 crc kubenswrapper[5033]: I0319 19:21:18.792772 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf"] Mar 19 19:21:18 crc kubenswrapper[5033]: W0319 19:21:18.810939 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78f1dc7a_f38b_4b03_ad2f_3cc0fd7d7803.slice/crio-e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b WatchSource:0}: Error finding container e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b: Status 404 returned error can't find the container with id e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b Mar 19 19:21:19 crc kubenswrapper[5033]: I0319 19:21:19.086338 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 19 19:21:19 crc kubenswrapper[5033]: I0319 19:21:19.564976 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" event={"ID":"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803","Type":"ContainerStarted","Data":"e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b"} Mar 19 19:21:20 crc kubenswrapper[5033]: I0319 
19:21:20.594980 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" event={"ID":"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803","Type":"ContainerStarted","Data":"6eb4f30d4f84cbec07f7ba3d0a43ddfbaaa26f6d3b07bd647fbdf54ba091170b"} Mar 19 19:21:20 crc kubenswrapper[5033]: I0319 19:21:20.618374 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" podStartSLOduration=2.893041397 podStartE2EDuration="3.618352229s" podCreationTimestamp="2026-03-19 19:21:17 +0000 UTC" firstStartedPulling="2026-03-19 19:21:18.815623436 +0000 UTC m=+1488.920653275" lastFinishedPulling="2026-03-19 19:21:19.540934258 +0000 UTC m=+1489.645964107" observedRunningTime="2026-03-19 19:21:20.608637772 +0000 UTC m=+1490.713667641" watchObservedRunningTime="2026-03-19 19:21:20.618352229 +0000 UTC m=+1490.723382078" Mar 19 19:21:40 crc kubenswrapper[5033]: I0319 19:21:40.759158 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:21:40 crc kubenswrapper[5033]: I0319 19:21:40.759790 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.140291 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565802-mzjpj"] Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.142617 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.145592 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.145996 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.152216 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.155512 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-mzjpj"] Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.313838 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcxx\" (UniqueName: \"kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx\") pod \"auto-csr-approver-29565802-mzjpj\" (UID: \"823a2735-37cf-45cc-9ed3-4c36ffe58b3d\") " pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.415437 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcxx\" (UniqueName: \"kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx\") pod \"auto-csr-approver-29565802-mzjpj\" (UID: \"823a2735-37cf-45cc-9ed3-4c36ffe58b3d\") " pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.433606 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcxx\" (UniqueName: \"kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx\") pod \"auto-csr-approver-29565802-mzjpj\" (UID: \"823a2735-37cf-45cc-9ed3-4c36ffe58b3d\") " 
pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.512918 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:00 crc kubenswrapper[5033]: I0319 19:22:00.999062 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-mzjpj"] Mar 19 19:22:02 crc kubenswrapper[5033]: I0319 19:22:02.003094 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" event={"ID":"823a2735-37cf-45cc-9ed3-4c36ffe58b3d","Type":"ContainerStarted","Data":"bc4486b0bfc2337f1087385fce2404f3a1302e65887b374397b4f63fd179db8b"} Mar 19 19:22:03 crc kubenswrapper[5033]: I0319 19:22:03.015707 5033 generic.go:334] "Generic (PLEG): container finished" podID="823a2735-37cf-45cc-9ed3-4c36ffe58b3d" containerID="f9ab2001c84e79539289aa76af417be704cbeb56d549bdffae50841944edfb46" exitCode=0 Mar 19 19:22:03 crc kubenswrapper[5033]: I0319 19:22:03.015796 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" event={"ID":"823a2735-37cf-45cc-9ed3-4c36ffe58b3d","Type":"ContainerDied","Data":"f9ab2001c84e79539289aa76af417be704cbeb56d549bdffae50841944edfb46"} Mar 19 19:22:04 crc kubenswrapper[5033]: I0319 19:22:04.864053 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.003631 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npcxx\" (UniqueName: \"kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx\") pod \"823a2735-37cf-45cc-9ed3-4c36ffe58b3d\" (UID: \"823a2735-37cf-45cc-9ed3-4c36ffe58b3d\") " Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.011102 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx" (OuterVolumeSpecName: "kube-api-access-npcxx") pod "823a2735-37cf-45cc-9ed3-4c36ffe58b3d" (UID: "823a2735-37cf-45cc-9ed3-4c36ffe58b3d"). InnerVolumeSpecName "kube-api-access-npcxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.039081 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" event={"ID":"823a2735-37cf-45cc-9ed3-4c36ffe58b3d","Type":"ContainerDied","Data":"bc4486b0bfc2337f1087385fce2404f3a1302e65887b374397b4f63fd179db8b"} Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.039124 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc4486b0bfc2337f1087385fce2404f3a1302e65887b374397b4f63fd179db8b" Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.039156 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-mzjpj" Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.106852 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npcxx\" (UniqueName: \"kubernetes.io/projected/823a2735-37cf-45cc-9ed3-4c36ffe58b3d-kube-api-access-npcxx\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.929512 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-zfzhh"] Mar 19 19:22:05 crc kubenswrapper[5033]: I0319 19:22:05.938493 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-zfzhh"] Mar 19 19:22:06 crc kubenswrapper[5033]: I0319 19:22:06.644640 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf2ccd6-cf13-4015-974e-70e4fb20c374" path="/var/lib/kubelet/pods/6bf2ccd6-cf13-4015-974e-70e4fb20c374/volumes" Mar 19 19:22:10 crc kubenswrapper[5033]: I0319 19:22:10.759070 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:22:10 crc kubenswrapper[5033]: I0319 19:22:10.759732 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:22:10 crc kubenswrapper[5033]: I0319 19:22:10.759783 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:22:10 crc kubenswrapper[5033]: I0319 19:22:10.760668 5033 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:22:10 crc kubenswrapper[5033]: I0319 19:22:10.760733 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce" gracePeriod=600 Mar 19 19:22:11 crc kubenswrapper[5033]: I0319 19:22:11.098166 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce" exitCode=0 Mar 19 19:22:11 crc kubenswrapper[5033]: I0319 19:22:11.098245 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce"} Mar 19 19:22:11 crc kubenswrapper[5033]: I0319 19:22:11.098600 5033 scope.go:117] "RemoveContainer" containerID="ed838a0537bc0b572529bc719cbc8bc412fcc7a7e09e7b88f61a35127f07851f" Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.110373 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6"} Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.817349 5033 scope.go:117] "RemoveContainer" 
containerID="0016d8f6cd890d8cd54b29cc4d36180c232483682614ac7645d93dfb9decc896" Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.848659 5033 scope.go:117] "RemoveContainer" containerID="cb2b84a659e89d69c2e18fd3512eda93885f551219cb65b36fdc9660beb9a96d" Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.911426 5033 scope.go:117] "RemoveContainer" containerID="63efe028fc693f0085f904b7ee103e19c28c6f8ae305f652d0a11d65dc2237db" Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.959236 5033 scope.go:117] "RemoveContainer" containerID="da77633a261b3813e864d2c099c0f8e3f5cc6e23046d4b8a79f2c1fc0d415695" Mar 19 19:22:12 crc kubenswrapper[5033]: I0319 19:22:12.996338 5033 scope.go:117] "RemoveContainer" containerID="674648697e22b3221647833afbe0ad84f56178bab693cda8b10a513986c21349" Mar 19 19:22:13 crc kubenswrapper[5033]: I0319 19:22:13.027277 5033 scope.go:117] "RemoveContainer" containerID="3ec877d37bd42b2eabbe46a9aa3fe785cc6c03ff251eef151e06328d630d0664" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.138405 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:03 crc kubenswrapper[5033]: E0319 19:23:03.139298 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823a2735-37cf-45cc-9ed3-4c36ffe58b3d" containerName="oc" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.139312 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="823a2735-37cf-45cc-9ed3-4c36ffe58b3d" containerName="oc" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.139509 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="823a2735-37cf-45cc-9ed3-4c36ffe58b3d" containerName="oc" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.141021 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.154189 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.200131 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.200226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.200260 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4p57\" (UniqueName: \"kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.301687 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4p57\" (UniqueName: \"kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.301855 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.301930 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.302373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.302380 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.335356 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4p57\" (UniqueName: \"kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57\") pod \"redhat-marketplace-d6bzt\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:03 crc kubenswrapper[5033]: I0319 19:23:03.459594 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:04 crc kubenswrapper[5033]: I0319 19:23:04.167128 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:05 crc kubenswrapper[5033]: I0319 19:23:05.077321 5033 generic.go:334] "Generic (PLEG): container finished" podID="f052c470-f66c-4d33-b435-f98ad80d5597" containerID="b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775" exitCode=0 Mar 19 19:23:05 crc kubenswrapper[5033]: I0319 19:23:05.077386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerDied","Data":"b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775"} Mar 19 19:23:05 crc kubenswrapper[5033]: I0319 19:23:05.077669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerStarted","Data":"10271940d9a059220198097c1ac331667b0c21a5d8b0da9f678c06031987b5c7"} Mar 19 19:23:06 crc kubenswrapper[5033]: I0319 19:23:06.088845 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerStarted","Data":"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e"} Mar 19 19:23:07 crc kubenswrapper[5033]: I0319 19:23:07.099808 5033 generic.go:334] "Generic (PLEG): container finished" podID="f052c470-f66c-4d33-b435-f98ad80d5597" containerID="c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e" exitCode=0 Mar 19 19:23:07 crc kubenswrapper[5033]: I0319 19:23:07.099871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" 
event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerDied","Data":"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e"} Mar 19 19:23:08 crc kubenswrapper[5033]: I0319 19:23:08.113840 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerStarted","Data":"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776"} Mar 19 19:23:08 crc kubenswrapper[5033]: I0319 19:23:08.134820 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d6bzt" podStartSLOduration=2.7452682409999998 podStartE2EDuration="5.134804457s" podCreationTimestamp="2026-03-19 19:23:03 +0000 UTC" firstStartedPulling="2026-03-19 19:23:05.07960595 +0000 UTC m=+1595.184635799" lastFinishedPulling="2026-03-19 19:23:07.469142166 +0000 UTC m=+1597.574172015" observedRunningTime="2026-03-19 19:23:08.132577763 +0000 UTC m=+1598.237607602" watchObservedRunningTime="2026-03-19 19:23:08.134804457 +0000 UTC m=+1598.239834306" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.170134 5033 scope.go:117] "RemoveContainer" containerID="aa2a4b7e07720b3e718c3466ef517985100d4fc5af6da3642c87053908055692" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.206440 5033 scope.go:117] "RemoveContainer" containerID="7505998b6596770a09e952b8929261129dc92a64475fc6f10608fd489414c08e" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.249857 5033 scope.go:117] "RemoveContainer" containerID="72f15a7ac2c4129a38ffe07a8d6244a8da70fb44210764b9531ef198c1406225" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.275797 5033 scope.go:117] "RemoveContainer" containerID="ada215b2b7e2af73c6ea402dd19ea7be8fa32fcb4d5a687f9dcbbdc15b5a29e1" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.460648 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.460701 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:13 crc kubenswrapper[5033]: I0319 19:23:13.504091 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:14 crc kubenswrapper[5033]: I0319 19:23:14.216801 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:14 crc kubenswrapper[5033]: I0319 19:23:14.262867 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:16 crc kubenswrapper[5033]: I0319 19:23:16.188551 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d6bzt" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="registry-server" containerID="cri-o://4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776" gracePeriod=2 Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.131513 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.207064 5033 generic.go:334] "Generic (PLEG): container finished" podID="f052c470-f66c-4d33-b435-f98ad80d5597" containerID="4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776" exitCode=0 Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.207099 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerDied","Data":"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776"} Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.207124 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6bzt" event={"ID":"f052c470-f66c-4d33-b435-f98ad80d5597","Type":"ContainerDied","Data":"10271940d9a059220198097c1ac331667b0c21a5d8b0da9f678c06031987b5c7"} Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.207141 5033 scope.go:117] "RemoveContainer" containerID="4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.207248 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6bzt" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.216120 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities\") pod \"f052c470-f66c-4d33-b435-f98ad80d5597\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.216230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4p57\" (UniqueName: \"kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57\") pod \"f052c470-f66c-4d33-b435-f98ad80d5597\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.216313 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content\") pod \"f052c470-f66c-4d33-b435-f98ad80d5597\" (UID: \"f052c470-f66c-4d33-b435-f98ad80d5597\") " Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.216796 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities" (OuterVolumeSpecName: "utilities") pod "f052c470-f66c-4d33-b435-f98ad80d5597" (UID: "f052c470-f66c-4d33-b435-f98ad80d5597"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.216978 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.223729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57" (OuterVolumeSpecName: "kube-api-access-j4p57") pod "f052c470-f66c-4d33-b435-f98ad80d5597" (UID: "f052c470-f66c-4d33-b435-f98ad80d5597"). InnerVolumeSpecName "kube-api-access-j4p57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.251903 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f052c470-f66c-4d33-b435-f98ad80d5597" (UID: "f052c470-f66c-4d33-b435-f98ad80d5597"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.258411 5033 scope.go:117] "RemoveContainer" containerID="c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.318441 5033 scope.go:117] "RemoveContainer" containerID="b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.318577 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4p57\" (UniqueName: \"kubernetes.io/projected/f052c470-f66c-4d33-b435-f98ad80d5597-kube-api-access-j4p57\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.318622 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f052c470-f66c-4d33-b435-f98ad80d5597-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.360536 5033 scope.go:117] "RemoveContainer" containerID="4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776" Mar 19 19:23:17 crc kubenswrapper[5033]: E0319 19:23:17.360971 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776\": container with ID starting with 4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776 not found: ID does not exist" containerID="4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.361000 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776"} err="failed to get container status \"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776\": rpc error: code = NotFound desc = could not find 
container \"4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776\": container with ID starting with 4a2917c7cc01b5d3fc7289a013a44227ffc554047915ef1743e61f7ffcdb9776 not found: ID does not exist" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.361021 5033 scope.go:117] "RemoveContainer" containerID="c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e" Mar 19 19:23:17 crc kubenswrapper[5033]: E0319 19:23:17.361482 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e\": container with ID starting with c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e not found: ID does not exist" containerID="c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.361520 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e"} err="failed to get container status \"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e\": rpc error: code = NotFound desc = could not find container \"c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e\": container with ID starting with c759d13722f8e897dd0e6004afaaf1a29bf6ee206d25b38fbba766e45451ed5e not found: ID does not exist" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.361546 5033 scope.go:117] "RemoveContainer" containerID="b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775" Mar 19 19:23:17 crc kubenswrapper[5033]: E0319 19:23:17.362000 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775\": container with ID starting with b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775 not found: ID does 
not exist" containerID="b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.362028 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775"} err="failed to get container status \"b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775\": rpc error: code = NotFound desc = could not find container \"b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775\": container with ID starting with b2107dccc8c37d180613fde00b77ec2753831a46abba1708bb122163902a1775 not found: ID does not exist" Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.539584 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:17 crc kubenswrapper[5033]: I0319 19:23:17.551390 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6bzt"] Mar 19 19:23:18 crc kubenswrapper[5033]: I0319 19:23:18.632521 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" path="/var/lib/kubelet/pods/f052c470-f66c-4d33-b435-f98ad80d5597/volumes" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.159207 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565804-47w5s"] Mar 19 19:24:00 crc kubenswrapper[5033]: E0319 19:24:00.160077 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="registry-server" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.160088 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="registry-server" Mar 19 19:24:00 crc kubenswrapper[5033]: E0319 19:24:00.160104 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="extract-utilities" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.160110 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="extract-utilities" Mar 19 19:24:00 crc kubenswrapper[5033]: E0319 19:24:00.160132 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="extract-content" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.160138 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="extract-content" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.160337 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f052c470-f66c-4d33-b435-f98ad80d5597" containerName="registry-server" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.161008 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.166943 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.166984 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.167051 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.170996 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-47w5s"] Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.255711 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw88c\" (UniqueName: 
\"kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c\") pod \"auto-csr-approver-29565804-47w5s\" (UID: \"56560b0d-1894-4ef0-a563-e3eefed012b8\") " pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.358028 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw88c\" (UniqueName: \"kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c\") pod \"auto-csr-approver-29565804-47w5s\" (UID: \"56560b0d-1894-4ef0-a563-e3eefed012b8\") " pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.382526 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw88c\" (UniqueName: \"kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c\") pod \"auto-csr-approver-29565804-47w5s\" (UID: \"56560b0d-1894-4ef0-a563-e3eefed012b8\") " pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:00 crc kubenswrapper[5033]: I0319 19:24:00.479953 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:01 crc kubenswrapper[5033]: I0319 19:24:01.249103 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-47w5s"] Mar 19 19:24:02 crc kubenswrapper[5033]: I0319 19:24:02.057309 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-47w5s" event={"ID":"56560b0d-1894-4ef0-a563-e3eefed012b8","Type":"ContainerStarted","Data":"fd56a1cb1fe99e7676fcc30e52e366b598849d55ba01f53e31bb2f7a0cea2f78"} Mar 19 19:24:03 crc kubenswrapper[5033]: I0319 19:24:03.067648 5033 generic.go:334] "Generic (PLEG): container finished" podID="56560b0d-1894-4ef0-a563-e3eefed012b8" containerID="9435823276ba60fe0d014e94bdaa2429445483e766bcb33e12f00030a31c1f7a" exitCode=0 Mar 19 19:24:03 crc kubenswrapper[5033]: I0319 19:24:03.067702 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-47w5s" event={"ID":"56560b0d-1894-4ef0-a563-e3eefed012b8","Type":"ContainerDied","Data":"9435823276ba60fe0d014e94bdaa2429445483e766bcb33e12f00030a31c1f7a"} Mar 19 19:24:04 crc kubenswrapper[5033]: I0319 19:24:04.951100 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.058396 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw88c\" (UniqueName: \"kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c\") pod \"56560b0d-1894-4ef0-a563-e3eefed012b8\" (UID: \"56560b0d-1894-4ef0-a563-e3eefed012b8\") " Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.063882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c" (OuterVolumeSpecName: "kube-api-access-sw88c") pod "56560b0d-1894-4ef0-a563-e3eefed012b8" (UID: "56560b0d-1894-4ef0-a563-e3eefed012b8"). InnerVolumeSpecName "kube-api-access-sw88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.090147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-47w5s" event={"ID":"56560b0d-1894-4ef0-a563-e3eefed012b8","Type":"ContainerDied","Data":"fd56a1cb1fe99e7676fcc30e52e366b598849d55ba01f53e31bb2f7a0cea2f78"} Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.090182 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-47w5s" Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.090187 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd56a1cb1fe99e7676fcc30e52e366b598849d55ba01f53e31bb2f7a0cea2f78" Mar 19 19:24:05 crc kubenswrapper[5033]: I0319 19:24:05.160853 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw88c\" (UniqueName: \"kubernetes.io/projected/56560b0d-1894-4ef0-a563-e3eefed012b8-kube-api-access-sw88c\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:06 crc kubenswrapper[5033]: I0319 19:24:06.019625 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-7lv97"] Mar 19 19:24:06 crc kubenswrapper[5033]: I0319 19:24:06.031184 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-7lv97"] Mar 19 19:24:06 crc kubenswrapper[5033]: I0319 19:24:06.631389 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e45587-5c12-422b-850f-782805c2169a" path="/var/lib/kubelet/pods/96e45587-5c12-422b-850f-782805c2169a/volumes" Mar 19 19:24:11 crc kubenswrapper[5033]: I0319 19:24:11.138481 5033 generic.go:334] "Generic (PLEG): container finished" podID="78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" containerID="6eb4f30d4f84cbec07f7ba3d0a43ddfbaaa26f6d3b07bd647fbdf54ba091170b" exitCode=0 Mar 19 19:24:11 crc kubenswrapper[5033]: I0319 19:24:11.138584 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" event={"ID":"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803","Type":"ContainerDied","Data":"6eb4f30d4f84cbec07f7ba3d0a43ddfbaaa26f6d3b07bd647fbdf54ba091170b"} Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.008870 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.117135 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vvrx\" (UniqueName: \"kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx\") pod \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.117242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam\") pod \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.117327 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory\") pod \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.117422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle\") pod \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\" (UID: \"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803\") " Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.127242 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx" (OuterVolumeSpecName: "kube-api-access-6vvrx") pod "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" (UID: "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803"). InnerVolumeSpecName "kube-api-access-6vvrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.127391 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" (UID: "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.157482 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" event={"ID":"78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803","Type":"ContainerDied","Data":"e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b"} Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.157526 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69dd2cba9ed9dcf6192068637f0bbe023cb37136050c443bd83226d8f71f79b" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.157587 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-snswf" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.159027 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" (UID: "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.168499 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory" (OuterVolumeSpecName: "inventory") pod "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" (UID: "78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.220286 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vvrx\" (UniqueName: \"kubernetes.io/projected/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-kube-api-access-6vvrx\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.220334 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.220347 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.220360 5033 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.243945 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt"] Mar 19 19:24:13 crc kubenswrapper[5033]: E0319 19:24:13.244330 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.244346 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:24:13 crc kubenswrapper[5033]: E0319 19:24:13.244370 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56560b0d-1894-4ef0-a563-e3eefed012b8" containerName="oc" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.244376 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56560b0d-1894-4ef0-a563-e3eefed012b8" containerName="oc" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.244599 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="56560b0d-1894-4ef0-a563-e3eefed012b8" containerName="oc" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.244626 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.245345 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.271873 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt"] Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.322438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.322553 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwwq\" (UniqueName: \"kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.322645 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.389593 5033 scope.go:117] "RemoveContainer" containerID="a82629c2564dd1ad6bd44e6f1be1bfd8f75714bca8c9c549103ad5a2cc961820" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.424832 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.424897 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwwq\" (UniqueName: \"kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.424955 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.430141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.430854 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.436621 5033 scope.go:117] "RemoveContainer" containerID="b5c24e6516126f026933d2df5fa1c622d696f28caae1dcf12d07b6103ab1598f" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.440975 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwwq\" (UniqueName: \"kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mfddt\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.469063 5033 scope.go:117] "RemoveContainer" containerID="01b87216f0e2a1bd6d0958ad72e7a51288cf22b548c2c96aa94b7d9306835880" Mar 19 19:24:13 crc kubenswrapper[5033]: I0319 19:24:13.576597 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:24:14 crc kubenswrapper[5033]: I0319 19:24:14.151519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt"] Mar 19 19:24:14 crc kubenswrapper[5033]: I0319 19:24:14.159464 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:24:14 crc kubenswrapper[5033]: I0319 19:24:14.168321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" event={"ID":"02b8790f-5100-462c-972a-ab03fa3e53fa","Type":"ContainerStarted","Data":"525cf76347f177c212a67263a0488446e8dcd0d30c22937047b02d82d9bfac71"} Mar 19 19:24:15 crc kubenswrapper[5033]: I0319 19:24:15.178310 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" event={"ID":"02b8790f-5100-462c-972a-ab03fa3e53fa","Type":"ContainerStarted","Data":"c573e14830618e6a17cecfad5055bd4448dcc9adcac7ab9591b0ee085934b55f"} Mar 19 19:24:15 crc kubenswrapper[5033]: I0319 19:24:15.202971 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" podStartSLOduration=1.7661110039999999 podStartE2EDuration="2.202947193s" podCreationTimestamp="2026-03-19 19:24:13 +0000 UTC" firstStartedPulling="2026-03-19 19:24:14.15920482 +0000 UTC m=+1664.264234669" lastFinishedPulling="2026-03-19 19:24:14.596041009 +0000 UTC m=+1664.701070858" observedRunningTime="2026-03-19 19:24:15.19221231 +0000 UTC m=+1665.297242159" watchObservedRunningTime="2026-03-19 19:24:15.202947193 +0000 UTC m=+1665.307977042" Mar 19 19:24:40 crc kubenswrapper[5033]: I0319 19:24:40.758422 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:24:40 crc kubenswrapper[5033]: I0319 19:24:40.758932 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.056178 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-f8tqn"] Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.068169 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9cns8"] Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.080089 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kd7bj"] Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.090411 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kd7bj"] Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.105215 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-f8tqn"] Mar 19 19:24:47 crc kubenswrapper[5033]: I0319 19:24:47.119964 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9cns8"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.056281 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-37d3-account-create-update-wsdp9"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.073668 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a359-account-create-update-sdjdx"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.089124 5033 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-6546-account-create-update-dg5vw"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.099309 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6546-account-create-update-dg5vw"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.120937 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-37d3-account-create-update-wsdp9"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.130099 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a359-account-create-update-sdjdx"] Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.634669 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffbb960-0f7e-4a03-9796-db1e3073d08e" path="/var/lib/kubelet/pods/5ffbb960-0f7e-4a03-9796-db1e3073d08e/volumes" Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.635578 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630d3b9e-00d5-4627-901b-958ff5a2aca6" path="/var/lib/kubelet/pods/630d3b9e-00d5-4627-901b-958ff5a2aca6/volumes" Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.636112 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755b22ef-8257-4051-b51f-88ad1249cf11" path="/var/lib/kubelet/pods/755b22ef-8257-4051-b51f-88ad1249cf11/volumes" Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.636800 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9143c6bb-985c-4b14-abf1-1813e54136cc" path="/var/lib/kubelet/pods/9143c6bb-985c-4b14-abf1-1813e54136cc/volumes" Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.637880 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9308b09f-03b2-4866-a458-26c8de752ae1" path="/var/lib/kubelet/pods/9308b09f-03b2-4866-a458-26c8de752ae1/volumes" Mar 19 19:24:48 crc kubenswrapper[5033]: I0319 19:24:48.638376 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c8b2953d-cccb-4334-9be9-a8a884cfb7a1" path="/var/lib/kubelet/pods/c8b2953d-cccb-4334-9be9-a8a884cfb7a1/volumes" Mar 19 19:25:08 crc kubenswrapper[5033]: I0319 19:25:08.069926 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6fd1-account-create-update-pc8fs"] Mar 19 19:25:08 crc kubenswrapper[5033]: I0319 19:25:08.081253 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6fd1-account-create-update-pc8fs"] Mar 19 19:25:08 crc kubenswrapper[5033]: I0319 19:25:08.631002 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d403cb3-1990-4495-8979-0b3f0593ccc1" path="/var/lib/kubelet/pods/1d403cb3-1990-4495-8979-0b3f0593ccc1/volumes" Mar 19 19:25:09 crc kubenswrapper[5033]: I0319 19:25:09.034685 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qc584"] Mar 19 19:25:09 crc kubenswrapper[5033]: I0319 19:25:09.045817 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-v74jb"] Mar 19 19:25:09 crc kubenswrapper[5033]: I0319 19:25:09.056581 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qc584"] Mar 19 19:25:09 crc kubenswrapper[5033]: I0319 19:25:09.065957 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-v74jb"] Mar 19 19:25:10 crc kubenswrapper[5033]: I0319 19:25:10.632947 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b698449-e87b-49b6-81d8-f101ce8304c9" path="/var/lib/kubelet/pods/0b698449-e87b-49b6-81d8-f101ce8304c9/volumes" Mar 19 19:25:10 crc kubenswrapper[5033]: I0319 19:25:10.633796 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a839076c-968b-4a81-9e49-f6c691b95dd6" path="/var/lib/kubelet/pods/a839076c-968b-4a81-9e49-f6c691b95dd6/volumes" Mar 19 19:25:10 crc kubenswrapper[5033]: I0319 19:25:10.758648 5033 patch_prober.go:28] interesting 
pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:25:10 crc kubenswrapper[5033]: I0319 19:25:10.758722 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.031637 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6e4d-account-create-update-bccmp"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.044948 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6fqmp"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.054417 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-xnstv"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.062554 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6e4d-account-create-update-bccmp"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.071074 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6fqmp"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.085984 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-xnstv"] Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.630985 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa4e771-cd1c-490f-bda5-4b30be140739" path="/var/lib/kubelet/pods/2fa4e771-cd1c-490f-bda5-4b30be140739/volumes" Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.631845 5033 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="64138b55-5be9-44e1-9663-5789ff9b51fd" path="/var/lib/kubelet/pods/64138b55-5be9-44e1-9663-5789ff9b51fd/volumes" Mar 19 19:25:12 crc kubenswrapper[5033]: I0319 19:25:12.632347 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b518db5b-e766-4d03-94b4-b3d72b3edbae" path="/var/lib/kubelet/pods/b518db5b-e766-4d03-94b4-b3d72b3edbae/volumes" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.115434 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-j9qgq"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.128967 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-292e-account-create-update-vzxnc"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.139156 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5391-account-create-update-wd7jh"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.148493 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5391-account-create-update-wd7jh"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.164691 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-292e-account-create-update-vzxnc"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.176338 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-j9qgq"] Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.544386 5033 scope.go:117] "RemoveContainer" containerID="3a08f21d3ea9528e3e93595063a06df4de4143e89a4f3a94ce3997861bbfe82b" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.574721 5033 scope.go:117] "RemoveContainer" containerID="b094d9d30f70d26d4b984c7ba5580d9211d15d98e225604356a9421169565fb7" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.626301 5033 scope.go:117] "RemoveContainer" containerID="ecc61e2309608f8eaa203a5546470634e17324237a7300753f20c38db7d2affe" Mar 19 19:25:13 crc 
kubenswrapper[5033]: I0319 19:25:13.678876 5033 scope.go:117] "RemoveContainer" containerID="4ad83655ca16a46b659927a74732ee5d396c77654ee09126f44adaee4c46ae5f" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.726153 5033 scope.go:117] "RemoveContainer" containerID="f1196529a345f04ce08891b6e1fab7d1896f67f80406e51b21c1e97290265e7e" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.773984 5033 scope.go:117] "RemoveContainer" containerID="4a37194d6d76a51ef090bb4212f544a6651602c96b5f61fba76eda4bb51794b4" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.824510 5033 scope.go:117] "RemoveContainer" containerID="0ce7e5743025c4198fea15cc250e53094640f6228eb009af102dcc0084740c55" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.844587 5033 scope.go:117] "RemoveContainer" containerID="013ecc38ffbe5625f4b21a384258763e8f955c6fe52b0c4583f257050a99ea8d" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.868752 5033 scope.go:117] "RemoveContainer" containerID="44c231df158f80b1b4587398f06347870bb684500186d6aa5fcff8717a989758" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.894572 5033 scope.go:117] "RemoveContainer" containerID="ee19acbfe70b1c87f13991876f338bf2bdd777fc62deead4eb9c41cda9864526" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.917278 5033 scope.go:117] "RemoveContainer" containerID="92374f44c7cd1850a79ddede9b9a4f43d3739e54fb9ed3df3f64befbc38bed71" Mar 19 19:25:13 crc kubenswrapper[5033]: I0319 19:25:13.948316 5033 scope.go:117] "RemoveContainer" containerID="b64c4ff3d3e6593afa889465ec3697ca7df32e286b935c4ff9b1fff18ce7d9f6" Mar 19 19:25:14 crc kubenswrapper[5033]: I0319 19:25:14.632705 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c873387-43a8-4d46-9499-8625c10a5e6d" path="/var/lib/kubelet/pods/1c873387-43a8-4d46-9499-8625c10a5e6d/volumes" Mar 19 19:25:14 crc kubenswrapper[5033]: I0319 19:25:14.633242 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8a241c55-520e-441e-97fa-137e8161daa3" path="/var/lib/kubelet/pods/8a241c55-520e-441e-97fa-137e8161daa3/volumes" Mar 19 19:25:14 crc kubenswrapper[5033]: I0319 19:25:14.633783 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e994ae-73e2-439b-bb97-c00e133d03cb" path="/var/lib/kubelet/pods/c8e994ae-73e2-439b-bb97-c00e133d03cb/volumes" Mar 19 19:25:15 crc kubenswrapper[5033]: I0319 19:25:15.026425 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mj7zf"] Mar 19 19:25:15 crc kubenswrapper[5033]: I0319 19:25:15.035393 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mj7zf"] Mar 19 19:25:16 crc kubenswrapper[5033]: I0319 19:25:16.631533 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8112e00f-afca-47a3-b233-b8282cccf396" path="/var/lib/kubelet/pods/8112e00f-afca-47a3-b233-b8282cccf396/volumes" Mar 19 19:25:18 crc kubenswrapper[5033]: I0319 19:25:18.036205 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kpzv2"] Mar 19 19:25:18 crc kubenswrapper[5033]: I0319 19:25:18.046832 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kpzv2"] Mar 19 19:25:18 crc kubenswrapper[5033]: I0319 19:25:18.632965 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4633785-c4bb-4b69-9383-e479734c029f" path="/var/lib/kubelet/pods/b4633785-c4bb-4b69-9383-e479734c029f/volumes" Mar 19 19:25:40 crc kubenswrapper[5033]: I0319 19:25:40.758562 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:25:40 crc kubenswrapper[5033]: I0319 19:25:40.759171 5033 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:25:40 crc kubenswrapper[5033]: I0319 19:25:40.759217 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:25:40 crc kubenswrapper[5033]: I0319 19:25:40.760001 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:25:40 crc kubenswrapper[5033]: I0319 19:25:40.760042 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" gracePeriod=600 Mar 19 19:25:40 crc kubenswrapper[5033]: E0319 19:25:40.886426 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:25:41 crc kubenswrapper[5033]: I0319 19:25:41.191844 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" 
containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" exitCode=0 Mar 19 19:25:41 crc kubenswrapper[5033]: I0319 19:25:41.191894 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6"} Mar 19 19:25:41 crc kubenswrapper[5033]: I0319 19:25:41.191929 5033 scope.go:117] "RemoveContainer" containerID="e4a5820860eabe0e21fa70a052085003d897754c14b00a03283d395b7e02cdce" Mar 19 19:25:41 crc kubenswrapper[5033]: I0319 19:25:41.192837 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:25:41 crc kubenswrapper[5033]: E0319 19:25:41.193216 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:25:45 crc kubenswrapper[5033]: I0319 19:25:45.043708 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vq5tm"] Mar 19 19:25:45 crc kubenswrapper[5033]: I0319 19:25:45.055670 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vq5tm"] Mar 19 19:25:46 crc kubenswrapper[5033]: I0319 19:25:46.253960 5033 generic.go:334] "Generic (PLEG): container finished" podID="02b8790f-5100-462c-972a-ab03fa3e53fa" containerID="c573e14830618e6a17cecfad5055bd4448dcc9adcac7ab9591b0ee085934b55f" exitCode=0 Mar 19 19:25:46 crc kubenswrapper[5033]: I0319 19:25:46.254077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" event={"ID":"02b8790f-5100-462c-972a-ab03fa3e53fa","Type":"ContainerDied","Data":"c573e14830618e6a17cecfad5055bd4448dcc9adcac7ab9591b0ee085934b55f"} Mar 19 19:25:46 crc kubenswrapper[5033]: I0319 19:25:46.643264 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cf8f72-4051-495d-970c-388cbd48a0bb" path="/var/lib/kubelet/pods/97cf8f72-4051-495d-970c-388cbd48a0bb/volumes" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.269781 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.272017 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" event={"ID":"02b8790f-5100-462c-972a-ab03fa3e53fa","Type":"ContainerDied","Data":"525cf76347f177c212a67263a0488446e8dcd0d30c22937047b02d82d9bfac71"} Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.272364 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="525cf76347f177c212a67263a0488446e8dcd0d30c22937047b02d82d9bfac71" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.272055 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mfddt" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.331102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam\") pod \"02b8790f-5100-462c-972a-ab03fa3e53fa\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.331146 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwwq\" (UniqueName: \"kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq\") pod \"02b8790f-5100-462c-972a-ab03fa3e53fa\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.331183 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory\") pod \"02b8790f-5100-462c-972a-ab03fa3e53fa\" (UID: \"02b8790f-5100-462c-972a-ab03fa3e53fa\") " Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.337571 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq" (OuterVolumeSpecName: "kube-api-access-4fwwq") pod "02b8790f-5100-462c-972a-ab03fa3e53fa" (UID: "02b8790f-5100-462c-972a-ab03fa3e53fa"). InnerVolumeSpecName "kube-api-access-4fwwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.369040 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02b8790f-5100-462c-972a-ab03fa3e53fa" (UID: "02b8790f-5100-462c-972a-ab03fa3e53fa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.369605 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory" (OuterVolumeSpecName: "inventory") pod "02b8790f-5100-462c-972a-ab03fa3e53fa" (UID: "02b8790f-5100-462c-972a-ab03fa3e53fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.434193 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.434224 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwwq\" (UniqueName: \"kubernetes.io/projected/02b8790f-5100-462c-972a-ab03fa3e53fa-kube-api-access-4fwwq\") on node \"crc\" DevicePath \"\"" Mar 19 19:25:48 crc kubenswrapper[5033]: I0319 19:25:48.434237 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02b8790f-5100-462c-972a-ab03fa3e53fa-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.369241 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv"] Mar 19 19:25:49 crc 
kubenswrapper[5033]: E0319 19:25:49.374712 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b8790f-5100-462c-972a-ab03fa3e53fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.374765 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b8790f-5100-462c-972a-ab03fa3e53fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.375400 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b8790f-5100-462c-972a-ab03fa3e53fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.376950 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.383267 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.383302 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.383535 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.383855 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.406282 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv"] Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.453205 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mplqm\" (UniqueName: \"kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.453298 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.453378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.555271 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.555405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.555544 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mplqm\" (UniqueName: \"kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.562290 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.562328 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.574952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mplqm\" (UniqueName: \"kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:49 crc kubenswrapper[5033]: I0319 19:25:49.731517 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:25:50 crc kubenswrapper[5033]: I0319 19:25:50.232220 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv"] Mar 19 19:25:50 crc kubenswrapper[5033]: I0319 19:25:50.289315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" event={"ID":"7e310e51-d5e9-4a0d-9ac7-246d34af93b5","Type":"ContainerStarted","Data":"81c432283d62073174887c0b31f0ad20b2255cf969bdf49870854048a59d863b"} Mar 19 19:25:51 crc kubenswrapper[5033]: I0319 19:25:51.301002 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" event={"ID":"7e310e51-d5e9-4a0d-9ac7-246d34af93b5","Type":"ContainerStarted","Data":"86ebcb43cb81d181d3a2dcfdcedec97266da77cea1d5c442dab270ccd48708ab"} Mar 19 19:25:51 crc kubenswrapper[5033]: I0319 19:25:51.321194 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" podStartSLOduration=1.796738645 podStartE2EDuration="2.321164696s" podCreationTimestamp="2026-03-19 19:25:49 +0000 UTC" firstStartedPulling="2026-03-19 19:25:50.235239537 +0000 UTC m=+1760.340269386" lastFinishedPulling="2026-03-19 19:25:50.759665588 +0000 UTC m=+1760.864695437" observedRunningTime="2026-03-19 19:25:51.313404187 +0000 UTC m=+1761.418434076" watchObservedRunningTime="2026-03-19 19:25:51.321164696 +0000 UTC m=+1761.426194575" Mar 19 19:25:52 crc kubenswrapper[5033]: I0319 19:25:52.620266 5033 scope.go:117] "RemoveContainer" 
containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:25:52 crc kubenswrapper[5033]: E0319 19:25:52.620617 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.031005 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6r6gh"] Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.046108 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dz8np"] Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.054959 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6r6gh"] Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.063948 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dz8np"] Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.634252 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e9fb01-0fe4-4bf2-9387-5919f16fb3ea" path="/var/lib/kubelet/pods/39e9fb01-0fe4-4bf2-9387-5919f16fb3ea/volumes" Mar 19 19:25:58 crc kubenswrapper[5033]: I0319 19:25:58.635255 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9" path="/var/lib/kubelet/pods/8d9bcf9f-a1d1-4031-a560-5c5d01dd76e9/volumes" Mar 19 19:25:59 crc kubenswrapper[5033]: I0319 19:25:59.047594 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t5tfl"] Mar 19 19:25:59 crc kubenswrapper[5033]: I0319 19:25:59.059219 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-t5tfl"] Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.146382 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565806-w24s6"] Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.149348 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.152215 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.152220 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.154302 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.161791 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-w24s6"] Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.278837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnt7\" (UniqueName: \"kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7\") pod \"auto-csr-approver-29565806-w24s6\" (UID: \"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c\") " pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.381147 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnt7\" (UniqueName: \"kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7\") pod \"auto-csr-approver-29565806-w24s6\" (UID: \"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c\") " pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:00 crc 
kubenswrapper[5033]: I0319 19:26:00.417736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnt7\" (UniqueName: \"kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7\") pod \"auto-csr-approver-29565806-w24s6\" (UID: \"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c\") " pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.494184 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.636285 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b4c780-bc31-4138-9aa2-3d47604bc88f" path="/var/lib/kubelet/pods/04b4c780-bc31-4138-9aa2-3d47604bc88f/volumes" Mar 19 19:26:00 crc kubenswrapper[5033]: I0319 19:26:00.976993 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-w24s6"] Mar 19 19:26:01 crc kubenswrapper[5033]: I0319 19:26:01.398337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-w24s6" event={"ID":"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c","Type":"ContainerStarted","Data":"b4b795e0313fa58ece8ec0244d0063785218f8d3069d8f911b0043062de88311"} Mar 19 19:26:02 crc kubenswrapper[5033]: I0319 19:26:02.413174 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-w24s6" event={"ID":"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c","Type":"ContainerStarted","Data":"58c2a5a536b777c21d76b253bfb70bea4e61315e54c211496b556d31f0b2fb39"} Mar 19 19:26:02 crc kubenswrapper[5033]: I0319 19:26:02.437504 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565806-w24s6" podStartSLOduration=1.442958331 podStartE2EDuration="2.437475599s" podCreationTimestamp="2026-03-19 19:26:00 +0000 UTC" firstStartedPulling="2026-03-19 
19:26:00.989428452 +0000 UTC m=+1771.094458301" lastFinishedPulling="2026-03-19 19:26:01.98394572 +0000 UTC m=+1772.088975569" observedRunningTime="2026-03-19 19:26:02.432634842 +0000 UTC m=+1772.537664691" watchObservedRunningTime="2026-03-19 19:26:02.437475599 +0000 UTC m=+1772.542505458" Mar 19 19:26:03 crc kubenswrapper[5033]: I0319 19:26:03.423693 5033 generic.go:334] "Generic (PLEG): container finished" podID="a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" containerID="58c2a5a536b777c21d76b253bfb70bea4e61315e54c211496b556d31f0b2fb39" exitCode=0 Mar 19 19:26:03 crc kubenswrapper[5033]: I0319 19:26:03.423804 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-w24s6" event={"ID":"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c","Type":"ContainerDied","Data":"58c2a5a536b777c21d76b253bfb70bea4e61315e54c211496b556d31f0b2fb39"} Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.313618 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.395821 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnnt7\" (UniqueName: \"kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7\") pod \"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c\" (UID: \"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c\") " Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.401695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7" (OuterVolumeSpecName: "kube-api-access-qnnt7") pod "a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" (UID: "a2aaa5eb-4661-4a7f-9aa8-37dcde93526c"). InnerVolumeSpecName "kube-api-access-qnnt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.444873 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-w24s6" event={"ID":"a2aaa5eb-4661-4a7f-9aa8-37dcde93526c","Type":"ContainerDied","Data":"b4b795e0313fa58ece8ec0244d0063785218f8d3069d8f911b0043062de88311"} Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.444920 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b795e0313fa58ece8ec0244d0063785218f8d3069d8f911b0043062de88311" Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.444931 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-w24s6" Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.506235 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnnt7\" (UniqueName: \"kubernetes.io/projected/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c-kube-api-access-qnnt7\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.510405 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-ctlbb"] Mar 19 19:26:05 crc kubenswrapper[5033]: I0319 19:26:05.519588 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-ctlbb"] Mar 19 19:26:06 crc kubenswrapper[5033]: I0319 19:26:06.645169 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c6e8dd-3762-416c-af0e-99322b5561c3" path="/var/lib/kubelet/pods/f1c6e8dd-3762-416c-af0e-99322b5561c3/volumes" Mar 19 19:26:07 crc kubenswrapper[5033]: I0319 19:26:07.621402 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:26:07 crc kubenswrapper[5033]: E0319 19:26:07.621784 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:26:09 crc kubenswrapper[5033]: I0319 19:26:09.032269 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2hbc5"] Mar 19 19:26:09 crc kubenswrapper[5033]: I0319 19:26:09.041023 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2hbc5"] Mar 19 19:26:10 crc kubenswrapper[5033]: I0319 19:26:10.631127 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715f11c5-fa05-4d4a-9d52-a21479ced465" path="/var/lib/kubelet/pods/715f11c5-fa05-4d4a-9d52-a21479ced465/volumes" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.166620 5033 scope.go:117] "RemoveContainer" containerID="3265d8b4452c653630bb972dab0f49ae5a5c4e688f18570c6c3cbbbeecb9f7f3" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.200643 5033 scope.go:117] "RemoveContainer" containerID="31f70b8f5a1a3014397d27ff5c6c0d76b418cb9d3bb3dc81f914aa5912fb671f" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.241852 5033 scope.go:117] "RemoveContainer" containerID="6cc5c3b8126b2c8c1fa31e4b66fc877eb0ed4271f9e8bff59b84c82196b2b2c3" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.286375 5033 scope.go:117] "RemoveContainer" containerID="5be6e2b12848c903b53c0d68a7b095c033cb04f6b1225992ec4af147b25a29a4" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.356784 5033 scope.go:117] "RemoveContainer" containerID="ec3c35d6f5f4748a119ed1fec4639a9c30c8d7239c0a61dbdcd0302a98af604c" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.383065 5033 scope.go:117] "RemoveContainer" containerID="d7910b8137230053464c6be9e00567c6796c1fc6a1d97a916e8ff5cef9431627" Mar 
19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.447503 5033 scope.go:117] "RemoveContainer" containerID="d9e60797cf5580a8928dd2f31e8f8810eda9729f856428040c13ef15c8e0986f" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.478327 5033 scope.go:117] "RemoveContainer" containerID="97aa45a9df5b72d0bf0a298a212e82dd7ad883939a6cdb9e0379358ce8048c49" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.498539 5033 scope.go:117] "RemoveContainer" containerID="f86cff58c74f47f0e364cdf06382332179b7eb8928d038c55623ea9f670b3098" Mar 19 19:26:14 crc kubenswrapper[5033]: I0319 19:26:14.515444 5033 scope.go:117] "RemoveContainer" containerID="b5eef0a406f9d3c72679a385009cdc60fdc8bd7a1b40d7e8a71f1de0d0c2eec0" Mar 19 19:26:22 crc kubenswrapper[5033]: I0319 19:26:22.620680 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:26:22 crc kubenswrapper[5033]: E0319 19:26:22.621525 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:26:34 crc kubenswrapper[5033]: I0319 19:26:34.623288 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:26:34 crc kubenswrapper[5033]: E0319 19:26:34.623925 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:26:49 crc kubenswrapper[5033]: I0319 19:26:49.620790 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:26:49 crc kubenswrapper[5033]: E0319 19:26:49.621616 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:26:52 crc kubenswrapper[5033]: I0319 19:26:52.915236 5033 generic.go:334] "Generic (PLEG): container finished" podID="7e310e51-d5e9-4a0d-9ac7-246d34af93b5" containerID="86ebcb43cb81d181d3a2dcfdcedec97266da77cea1d5c442dab270ccd48708ab" exitCode=0 Mar 19 19:26:52 crc kubenswrapper[5033]: I0319 19:26:52.915326 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" event={"ID":"7e310e51-d5e9-4a0d-9ac7-246d34af93b5","Type":"ContainerDied","Data":"86ebcb43cb81d181d3a2dcfdcedec97266da77cea1d5c442dab270ccd48708ab"} Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.644186 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.740585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mplqm\" (UniqueName: \"kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm\") pod \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.740705 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam\") pod \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.740826 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory\") pod \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\" (UID: \"7e310e51-d5e9-4a0d-9ac7-246d34af93b5\") " Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.775275 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm" (OuterVolumeSpecName: "kube-api-access-mplqm") pod "7e310e51-d5e9-4a0d-9ac7-246d34af93b5" (UID: "7e310e51-d5e9-4a0d-9ac7-246d34af93b5"). InnerVolumeSpecName "kube-api-access-mplqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.787822 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory" (OuterVolumeSpecName: "inventory") pod "7e310e51-d5e9-4a0d-9ac7-246d34af93b5" (UID: "7e310e51-d5e9-4a0d-9ac7-246d34af93b5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.801280 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e310e51-d5e9-4a0d-9ac7-246d34af93b5" (UID: "7e310e51-d5e9-4a0d-9ac7-246d34af93b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.845710 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mplqm\" (UniqueName: \"kubernetes.io/projected/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-kube-api-access-mplqm\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.845742 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.845753 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e310e51-d5e9-4a0d-9ac7-246d34af93b5-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.935363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" event={"ID":"7e310e51-d5e9-4a0d-9ac7-246d34af93b5","Type":"ContainerDied","Data":"81c432283d62073174887c0b31f0ad20b2255cf969bdf49870854048a59d863b"} Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 19:26:54.935797 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c432283d62073174887c0b31f0ad20b2255cf969bdf49870854048a59d863b" Mar 19 19:26:54 crc kubenswrapper[5033]: I0319 
19:26:54.935868 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.032814 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7"] Mar 19 19:26:55 crc kubenswrapper[5033]: E0319 19:26:55.033215 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e310e51-d5e9-4a0d-9ac7-246d34af93b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.033233 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e310e51-d5e9-4a0d-9ac7-246d34af93b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:55 crc kubenswrapper[5033]: E0319 19:26:55.033256 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" containerName="oc" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.033261 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" containerName="oc" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.033437 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e310e51-d5e9-4a0d-9ac7-246d34af93b5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.033474 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" containerName="oc" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.034204 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.037241 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.038186 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.042199 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.044836 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.046126 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7"] Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.154773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.154856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698p7\" (UniqueName: \"kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 
19:26:55.155053 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: E0319 19:26:55.158140 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e310e51_d5e9_4a0d_9ac7_246d34af93b5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e310e51_d5e9_4a0d_9ac7_246d34af93b5.slice/crio-81c432283d62073174887c0b31f0ad20b2255cf969bdf49870854048a59d863b\": RecentStats: unable to find data in memory cache]" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.256821 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.256925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.256966 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698p7\" (UniqueName: \"kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.261861 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.271813 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.273516 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698p7\" (UniqueName: \"kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-khxk7\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.365276 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.915069 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7"] Mar 19 19:26:55 crc kubenswrapper[5033]: I0319 19:26:55.945426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" event={"ID":"2323e870-76a4-4404-aed7-0c40ee7dd4d2","Type":"ContainerStarted","Data":"8af5a43aee89ff9caf650199b31279df3c256c616c2922541a6a87fa75a2461e"} Mar 19 19:26:56 crc kubenswrapper[5033]: I0319 19:26:56.953412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" event={"ID":"2323e870-76a4-4404-aed7-0c40ee7dd4d2","Type":"ContainerStarted","Data":"e6aa86f6829bddc3686638630f25591b814668d985fc3bbb174b158805fe08a7"} Mar 19 19:26:56 crc kubenswrapper[5033]: I0319 19:26:56.971068 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" podStartSLOduration=1.50646014 podStartE2EDuration="1.971044691s" podCreationTimestamp="2026-03-19 19:26:55 +0000 UTC" firstStartedPulling="2026-03-19 19:26:55.920583093 +0000 UTC m=+1826.025612942" lastFinishedPulling="2026-03-19 19:26:56.385167634 +0000 UTC m=+1826.490197493" observedRunningTime="2026-03-19 19:26:56.970145816 +0000 UTC m=+1827.075175665" watchObservedRunningTime="2026-03-19 19:26:56.971044691 +0000 UTC m=+1827.076074550" Mar 19 19:27:00 crc kubenswrapper[5033]: I0319 19:27:00.628846 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:27:00 crc kubenswrapper[5033]: E0319 19:27:00.629812 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:27:02 crc kubenswrapper[5033]: I0319 19:27:02.004953 5033 generic.go:334] "Generic (PLEG): container finished" podID="2323e870-76a4-4404-aed7-0c40ee7dd4d2" containerID="e6aa86f6829bddc3686638630f25591b814668d985fc3bbb174b158805fe08a7" exitCode=0 Mar 19 19:27:02 crc kubenswrapper[5033]: I0319 19:27:02.005033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" event={"ID":"2323e870-76a4-4404-aed7-0c40ee7dd4d2","Type":"ContainerDied","Data":"e6aa86f6829bddc3686638630f25591b814668d985fc3bbb174b158805fe08a7"} Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.763835 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.842582 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698p7\" (UniqueName: \"kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7\") pod \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.842627 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam\") pod \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.842646 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory\") pod \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\" (UID: \"2323e870-76a4-4404-aed7-0c40ee7dd4d2\") " Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.856879 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7" (OuterVolumeSpecName: "kube-api-access-698p7") pod "2323e870-76a4-4404-aed7-0c40ee7dd4d2" (UID: "2323e870-76a4-4404-aed7-0c40ee7dd4d2"). InnerVolumeSpecName "kube-api-access-698p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.874638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory" (OuterVolumeSpecName: "inventory") pod "2323e870-76a4-4404-aed7-0c40ee7dd4d2" (UID: "2323e870-76a4-4404-aed7-0c40ee7dd4d2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.893072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2323e870-76a4-4404-aed7-0c40ee7dd4d2" (UID: "2323e870-76a4-4404-aed7-0c40ee7dd4d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.945063 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698p7\" (UniqueName: \"kubernetes.io/projected/2323e870-76a4-4404-aed7-0c40ee7dd4d2-kube-api-access-698p7\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.945110 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:03 crc kubenswrapper[5033]: I0319 19:27:03.945124 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2323e870-76a4-4404-aed7-0c40ee7dd4d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.027931 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" event={"ID":"2323e870-76a4-4404-aed7-0c40ee7dd4d2","Type":"ContainerDied","Data":"8af5a43aee89ff9caf650199b31279df3c256c616c2922541a6a87fa75a2461e"} Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.027968 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-khxk7" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.027975 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8af5a43aee89ff9caf650199b31279df3c256c616c2922541a6a87fa75a2461e" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.103840 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b"] Mar 19 19:27:04 crc kubenswrapper[5033]: E0319 19:27:04.104324 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2323e870-76a4-4404-aed7-0c40ee7dd4d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.104342 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2323e870-76a4-4404-aed7-0c40ee7dd4d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.104531 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2323e870-76a4-4404-aed7-0c40ee7dd4d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.105339 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.114838 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.115125 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.115252 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.115564 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.115881 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b"] Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.161772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ncb\" (UniqueName: \"kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.162269 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 
19:27:04.162568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.265019 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9ncb\" (UniqueName: \"kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.265101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.265238 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.274924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.278780 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.293130 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ncb\" (UniqueName: \"kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bf66b\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:04 crc kubenswrapper[5033]: I0319 19:27:04.424103 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:05 crc kubenswrapper[5033]: I0319 19:27:05.027591 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b"] Mar 19 19:27:05 crc kubenswrapper[5033]: I0319 19:27:05.036791 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" event={"ID":"047e163c-b9d5-4426-9810-3cdf70a856a4","Type":"ContainerStarted","Data":"0a67ddb3b3cb7f3b7358d17c5e4de49b37e58f604dd65b601e011ab7926baf66"} Mar 19 19:27:06 crc kubenswrapper[5033]: I0319 19:27:06.052091 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" event={"ID":"047e163c-b9d5-4426-9810-3cdf70a856a4","Type":"ContainerStarted","Data":"597a1d7093a3687b98c22ade543ff8fb677b62113650af5f853f040f7a725c43"} Mar 19 19:27:06 crc kubenswrapper[5033]: I0319 19:27:06.088711 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" podStartSLOduration=1.6354586850000001 podStartE2EDuration="2.088683895s" podCreationTimestamp="2026-03-19 19:27:04 +0000 UTC" firstStartedPulling="2026-03-19 19:27:05.024831149 +0000 UTC m=+1835.129860998" lastFinishedPulling="2026-03-19 19:27:05.478056349 +0000 UTC m=+1835.583086208" observedRunningTime="2026-03-19 19:27:06.072129658 +0000 UTC m=+1836.177159517" watchObservedRunningTime="2026-03-19 19:27:06.088683895 +0000 UTC m=+1836.193713774" Mar 19 19:27:11 crc kubenswrapper[5033]: I0319 19:27:11.620351 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:27:11 crc kubenswrapper[5033]: E0319 19:27:11.622657 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:27:14 crc kubenswrapper[5033]: I0319 19:27:14.695154 5033 scope.go:117] "RemoveContainer" containerID="55d3e136266e06a108fd7dc086a680131e07976c1a126cae7b34a40d5deb59eb" Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.050194 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nll7n"] Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.061251 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2z6h7"] Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.069787 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nll7n"] Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.079242 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2z6h7"] Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.632026 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f7fa82-563b-4906-8e47-0ef910e3993a" path="/var/lib/kubelet/pods/14f7fa82-563b-4906-8e47-0ef910e3993a/volumes" Mar 19 19:27:20 crc kubenswrapper[5033]: I0319 19:27:20.632645 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d670e98-83a9-43c0-b723-caf42651cfcc" path="/var/lib/kubelet/pods/6d670e98-83a9-43c0-b723-caf42651cfcc/volumes" Mar 19 19:27:21 crc kubenswrapper[5033]: I0319 19:27:21.035592 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8350-account-create-update-7bhxl"] Mar 19 19:27:21 crc kubenswrapper[5033]: I0319 19:27:21.045473 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8350-account-create-update-7bhxl"] Mar 19 19:27:22 
crc kubenswrapper[5033]: I0319 19:27:22.034303 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d5pz8"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.048924 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-08d0-account-create-update-5mrwk"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.064574 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9863-account-create-update-hwtsv"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.077185 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d5pz8"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.087330 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-08d0-account-create-update-5mrwk"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.100819 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9863-account-create-update-hwtsv"] Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.634698 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac10a03-94cf-460b-9d47-305d6ffaa16a" path="/var/lib/kubelet/pods/0ac10a03-94cf-460b-9d47-305d6ffaa16a/volumes" Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.635237 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f78ec8-661f-4f3b-88b6-cc687515ba76" path="/var/lib/kubelet/pods/49f78ec8-661f-4f3b-88b6-cc687515ba76/volumes" Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.635799 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83e6b50-dafa-4bb0-96d1-89ad756e947a" path="/var/lib/kubelet/pods/b83e6b50-dafa-4bb0-96d1-89ad756e947a/volumes" Mar 19 19:27:22 crc kubenswrapper[5033]: I0319 19:27:22.636322 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f469bfad-5e99-414b-b44c-18c5fde5a96b" 
path="/var/lib/kubelet/pods/f469bfad-5e99-414b-b44c-18c5fde5a96b/volumes" Mar 19 19:27:25 crc kubenswrapper[5033]: I0319 19:27:25.621167 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:27:25 crc kubenswrapper[5033]: E0319 19:27:25.621721 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:27:40 crc kubenswrapper[5033]: I0319 19:27:40.407742 5033 generic.go:334] "Generic (PLEG): container finished" podID="047e163c-b9d5-4426-9810-3cdf70a856a4" containerID="597a1d7093a3687b98c22ade543ff8fb677b62113650af5f853f040f7a725c43" exitCode=0 Mar 19 19:27:40 crc kubenswrapper[5033]: I0319 19:27:40.407828 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" event={"ID":"047e163c-b9d5-4426-9810-3cdf70a856a4","Type":"ContainerDied","Data":"597a1d7093a3687b98c22ade543ff8fb677b62113650af5f853f040f7a725c43"} Mar 19 19:27:40 crc kubenswrapper[5033]: I0319 19:27:40.628352 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:27:40 crc kubenswrapper[5033]: E0319 19:27:40.629366 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:27:41 crc kubenswrapper[5033]: I0319 19:27:41.921114 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.086536 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam\") pod \"047e163c-b9d5-4426-9810-3cdf70a856a4\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.086649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9ncb\" (UniqueName: \"kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb\") pod \"047e163c-b9d5-4426-9810-3cdf70a856a4\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.086803 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory\") pod \"047e163c-b9d5-4426-9810-3cdf70a856a4\" (UID: \"047e163c-b9d5-4426-9810-3cdf70a856a4\") " Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.092573 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb" (OuterVolumeSpecName: "kube-api-access-l9ncb") pod "047e163c-b9d5-4426-9810-3cdf70a856a4" (UID: "047e163c-b9d5-4426-9810-3cdf70a856a4"). InnerVolumeSpecName "kube-api-access-l9ncb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.116265 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory" (OuterVolumeSpecName: "inventory") pod "047e163c-b9d5-4426-9810-3cdf70a856a4" (UID: "047e163c-b9d5-4426-9810-3cdf70a856a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.117813 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "047e163c-b9d5-4426-9810-3cdf70a856a4" (UID: "047e163c-b9d5-4426-9810-3cdf70a856a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.189340 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9ncb\" (UniqueName: \"kubernetes.io/projected/047e163c-b9d5-4426-9810-3cdf70a856a4-kube-api-access-l9ncb\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.189373 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.189383 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047e163c-b9d5-4426-9810-3cdf70a856a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.430297 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" 
event={"ID":"047e163c-b9d5-4426-9810-3cdf70a856a4","Type":"ContainerDied","Data":"0a67ddb3b3cb7f3b7358d17c5e4de49b37e58f604dd65b601e011ab7926baf66"} Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.430334 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bf66b" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.430344 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a67ddb3b3cb7f3b7358d17c5e4de49b37e58f604dd65b601e011ab7926baf66" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.529645 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx"] Mar 19 19:27:42 crc kubenswrapper[5033]: E0319 19:27:42.530212 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047e163c-b9d5-4426-9810-3cdf70a856a4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.530242 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="047e163c-b9d5-4426-9810-3cdf70a856a4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.530630 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="047e163c-b9d5-4426-9810-3cdf70a856a4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.532302 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.534846 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.535072 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.535509 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.535551 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.554564 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx"] Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.700546 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.701262 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs42k\" (UniqueName: \"kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc 
kubenswrapper[5033]: I0319 19:27:42.701405 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.803339 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.803463 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs42k\" (UniqueName: \"kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.803515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.808086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.808852 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.830619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs42k\" (UniqueName: \"kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:42 crc kubenswrapper[5033]: I0319 19:27:42.850701 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:27:43 crc kubenswrapper[5033]: I0319 19:27:43.427930 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx"] Mar 19 19:27:43 crc kubenswrapper[5033]: I0319 19:27:43.443220 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" event={"ID":"8f4b71de-a42d-4793-a035-a4728876706e","Type":"ContainerStarted","Data":"536b0c04e013505fe076d26826a38162eca0d14eb7731e87d97baca25db336de"} Mar 19 19:27:44 crc kubenswrapper[5033]: I0319 19:27:44.456674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" event={"ID":"8f4b71de-a42d-4793-a035-a4728876706e","Type":"ContainerStarted","Data":"ee6337e55526931537160e14ebd01c7f86c1fda8095eb9c431202a751ae16eac"} Mar 19 19:27:44 crc kubenswrapper[5033]: I0319 19:27:44.484869 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" podStartSLOduration=2.091435395 podStartE2EDuration="2.484848027s" podCreationTimestamp="2026-03-19 19:27:42 +0000 UTC" firstStartedPulling="2026-03-19 19:27:43.424355266 +0000 UTC m=+1873.529385105" lastFinishedPulling="2026-03-19 19:27:43.817767888 +0000 UTC m=+1873.922797737" observedRunningTime="2026-03-19 19:27:44.476743328 +0000 UTC m=+1874.581773177" watchObservedRunningTime="2026-03-19 19:27:44.484848027 +0000 UTC m=+1874.589877876" Mar 19 19:27:51 crc kubenswrapper[5033]: I0319 19:27:51.045497 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kpkmt"] Mar 19 19:27:51 crc kubenswrapper[5033]: I0319 19:27:51.053985 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kpkmt"] Mar 19 19:27:52 crc 
kubenswrapper[5033]: I0319 19:27:52.620909 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:27:52 crc kubenswrapper[5033]: E0319 19:27:52.622374 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:27:52 crc kubenswrapper[5033]: I0319 19:27:52.631384 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae335ad1-43b7-4f59-a78a-bfe88ed68cd7" path="/var/lib/kubelet/pods/ae335ad1-43b7-4f59-a78a-bfe88ed68cd7/volumes" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.140940 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565808-zlrn8"] Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.143632 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.147143 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.147228 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.147424 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.152094 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-zlrn8"] Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.180752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzwg\" (UniqueName: \"kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg\") pod \"auto-csr-approver-29565808-zlrn8\" (UID: \"048ce385-f25d-4fca-9508-7db919f7cb5f\") " pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.283432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzwg\" (UniqueName: \"kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg\") pod \"auto-csr-approver-29565808-zlrn8\" (UID: \"048ce385-f25d-4fca-9508-7db919f7cb5f\") " pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.308698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzwg\" (UniqueName: \"kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg\") pod \"auto-csr-approver-29565808-zlrn8\" (UID: \"048ce385-f25d-4fca-9508-7db919f7cb5f\") " 
pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.467049 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:00 crc kubenswrapper[5033]: I0319 19:28:00.947444 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-zlrn8"] Mar 19 19:28:00 crc kubenswrapper[5033]: W0319 19:28:00.951155 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod048ce385_f25d_4fca_9508_7db919f7cb5f.slice/crio-5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705 WatchSource:0}: Error finding container 5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705: Status 404 returned error can't find the container with id 5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705 Mar 19 19:28:01 crc kubenswrapper[5033]: I0319 19:28:01.641073 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" event={"ID":"048ce385-f25d-4fca-9508-7db919f7cb5f","Type":"ContainerStarted","Data":"5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705"} Mar 19 19:28:02 crc kubenswrapper[5033]: I0319 19:28:02.653338 5033 generic.go:334] "Generic (PLEG): container finished" podID="048ce385-f25d-4fca-9508-7db919f7cb5f" containerID="b6b2f3c156845ff9541d095075f5418ab1c766d5fa85178729e9cc43709f4524" exitCode=0 Mar 19 19:28:02 crc kubenswrapper[5033]: I0319 19:28:02.653709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" event={"ID":"048ce385-f25d-4fca-9508-7db919f7cb5f","Type":"ContainerDied","Data":"b6b2f3c156845ff9541d095075f5418ab1c766d5fa85178729e9cc43709f4524"} Mar 19 19:28:03 crc kubenswrapper[5033]: I0319 19:28:03.621021 5033 scope.go:117] "RemoveContainer" 
containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:28:03 crc kubenswrapper[5033]: E0319 19:28:03.621656 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.556357 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.668380 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhzwg\" (UniqueName: \"kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg\") pod \"048ce385-f25d-4fca-9508-7db919f7cb5f\" (UID: \"048ce385-f25d-4fca-9508-7db919f7cb5f\") " Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.675965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" event={"ID":"048ce385-f25d-4fca-9508-7db919f7cb5f","Type":"ContainerDied","Data":"5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705"} Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.676030 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5057bf3945190ad7d1c6d4e28ef9d588c6935fa634d2610e1a240adc84200705" Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.676106 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-zlrn8" Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.686235 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg" (OuterVolumeSpecName: "kube-api-access-mhzwg") pod "048ce385-f25d-4fca-9508-7db919f7cb5f" (UID: "048ce385-f25d-4fca-9508-7db919f7cb5f"). InnerVolumeSpecName "kube-api-access-mhzwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:04 crc kubenswrapper[5033]: I0319 19:28:04.771428 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhzwg\" (UniqueName: \"kubernetes.io/projected/048ce385-f25d-4fca-9508-7db919f7cb5f-kube-api-access-mhzwg\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:05 crc kubenswrapper[5033]: I0319 19:28:05.623107 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-mzjpj"] Mar 19 19:28:05 crc kubenswrapper[5033]: I0319 19:28:05.630622 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-mzjpj"] Mar 19 19:28:06 crc kubenswrapper[5033]: I0319 19:28:06.635263 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823a2735-37cf-45cc-9ed3-4c36ffe58b3d" path="/var/lib/kubelet/pods/823a2735-37cf-45cc-9ed3-4c36ffe58b3d/volumes" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.621200 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:28:14 crc kubenswrapper[5033]: E0319 19:28:14.622401 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.796400 5033 scope.go:117] "RemoveContainer" containerID="f9ab2001c84e79539289aa76af417be704cbeb56d549bdffae50841944edfb46" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.850544 5033 scope.go:117] "RemoveContainer" containerID="3cc6fba13cbb2ebb459b195296ab86e2695c67baf4026c237fcf55201ab24a8b" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.888591 5033 scope.go:117] "RemoveContainer" containerID="9747160d44c1d6cba806d0c6f3ba0073bf6fda85fb15e8ecae3042aa0cd43364" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.938132 5033 scope.go:117] "RemoveContainer" containerID="c8de08b05f8963e58049870bb4120136e569f9b34b6d5ec3743f988013cadf32" Mar 19 19:28:14 crc kubenswrapper[5033]: I0319 19:28:14.987746 5033 scope.go:117] "RemoveContainer" containerID="534cacdb4a77f3f82b88746dcdec6a48bed1e5408eab3027018d3455072baf70" Mar 19 19:28:15 crc kubenswrapper[5033]: I0319 19:28:15.038988 5033 scope.go:117] "RemoveContainer" containerID="361957c22d69df4bf08a7ff9336fe6772f71607f26c2fe4b9344eba8458ddf19" Mar 19 19:28:15 crc kubenswrapper[5033]: I0319 19:28:15.080949 5033 scope.go:117] "RemoveContainer" containerID="f796b21829d82e98c088eeaabacbafbe9b9bca08ba4d8b872e2a9ff2635d3e7a" Mar 19 19:28:15 crc kubenswrapper[5033]: I0319 19:28:15.098209 5033 scope.go:117] "RemoveContainer" containerID="4f6d5113c10a077e188734d798b0443e95e057d5c99bd69653518cb220db3329" Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.040174 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfb9t"] Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.051699 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wfb9t"] Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.062985 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-77vvw"] Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.073414 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-77vvw"] Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.631580 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdced5f-fb99-4010-b386-f944a01767df" path="/var/lib/kubelet/pods/3bdced5f-fb99-4010-b386-f944a01767df/volumes" Mar 19 19:28:18 crc kubenswrapper[5033]: I0319 19:28:18.632438 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97baf414-f658-4c71-83a5-a0ecd6d09e90" path="/var/lib/kubelet/pods/97baf414-f658-4c71-83a5-a0ecd6d09e90/volumes" Mar 19 19:28:29 crc kubenswrapper[5033]: I0319 19:28:29.620444 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:28:29 crc kubenswrapper[5033]: E0319 19:28:29.621224 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:28:29 crc kubenswrapper[5033]: I0319 19:28:29.913269 5033 generic.go:334] "Generic (PLEG): container finished" podID="8f4b71de-a42d-4793-a035-a4728876706e" containerID="ee6337e55526931537160e14ebd01c7f86c1fda8095eb9c431202a751ae16eac" exitCode=0 Mar 19 19:28:29 crc kubenswrapper[5033]: I0319 19:28:29.913406 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" event={"ID":"8f4b71de-a42d-4793-a035-a4728876706e","Type":"ContainerDied","Data":"ee6337e55526931537160e14ebd01c7f86c1fda8095eb9c431202a751ae16eac"} Mar 19 
19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.426474 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.617637 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory\") pod \"8f4b71de-a42d-4793-a035-a4728876706e\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.618242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs42k\" (UniqueName: \"kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k\") pod \"8f4b71de-a42d-4793-a035-a4728876706e\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.618287 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam\") pod \"8f4b71de-a42d-4793-a035-a4728876706e\" (UID: \"8f4b71de-a42d-4793-a035-a4728876706e\") " Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.623021 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k" (OuterVolumeSpecName: "kube-api-access-zs42k") pod "8f4b71de-a42d-4793-a035-a4728876706e" (UID: "8f4b71de-a42d-4793-a035-a4728876706e"). InnerVolumeSpecName "kube-api-access-zs42k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.646252 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory" (OuterVolumeSpecName: "inventory") pod "8f4b71de-a42d-4793-a035-a4728876706e" (UID: "8f4b71de-a42d-4793-a035-a4728876706e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.664646 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8f4b71de-a42d-4793-a035-a4728876706e" (UID: "8f4b71de-a42d-4793-a035-a4728876706e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.721283 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs42k\" (UniqueName: \"kubernetes.io/projected/8f4b71de-a42d-4793-a035-a4728876706e-kube-api-access-zs42k\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.721322 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.721348 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4b71de-a42d-4793-a035-a4728876706e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.944149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" 
event={"ID":"8f4b71de-a42d-4793-a035-a4728876706e","Type":"ContainerDied","Data":"536b0c04e013505fe076d26826a38162eca0d14eb7731e87d97baca25db336de"} Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.944188 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="536b0c04e013505fe076d26826a38162eca0d14eb7731e87d97baca25db336de" Mar 19 19:28:31 crc kubenswrapper[5033]: I0319 19:28:31.944254 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.035134 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4jwkh"] Mar 19 19:28:32 crc kubenswrapper[5033]: E0319 19:28:32.035584 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048ce385-f25d-4fca-9508-7db919f7cb5f" containerName="oc" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.035603 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="048ce385-f25d-4fca-9508-7db919f7cb5f" containerName="oc" Mar 19 19:28:32 crc kubenswrapper[5033]: E0319 19:28:32.035619 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4b71de-a42d-4793-a035-a4728876706e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.035627 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4b71de-a42d-4793-a035-a4728876706e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.035831 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4b71de-a42d-4793-a035-a4728876706e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.035844 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="048ce385-f25d-4fca-9508-7db919f7cb5f" 
containerName="oc" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.036555 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.041494 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.044430 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.044488 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.044687 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.060773 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4jwkh"] Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.128408 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc4tf\" (UniqueName: \"kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.128568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc 
kubenswrapper[5033]: I0319 19:28:32.128605 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.231267 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc4tf\" (UniqueName: \"kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.231393 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.231426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.236944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: 
\"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.244222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.248281 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc4tf\" (UniqueName: \"kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf\") pod \"ssh-known-hosts-edpm-deployment-4jwkh\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.353020 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.902487 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4jwkh"] Mar 19 19:28:32 crc kubenswrapper[5033]: W0319 19:28:32.907833 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24a4d98_fb0f_4bcd_905f_4bfce712a950.slice/crio-bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3 WatchSource:0}: Error finding container bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3: Status 404 returned error can't find the container with id bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3 Mar 19 19:28:32 crc kubenswrapper[5033]: I0319 19:28:32.953700 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" event={"ID":"f24a4d98-fb0f-4bcd-905f-4bfce712a950","Type":"ContainerStarted","Data":"bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3"} Mar 19 19:28:33 crc kubenswrapper[5033]: I0319 19:28:33.967463 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" event={"ID":"f24a4d98-fb0f-4bcd-905f-4bfce712a950","Type":"ContainerStarted","Data":"1719b1b42259e1d9b2980866df3a29d568ff27cee2976b3333fc48f711ac900b"} Mar 19 19:28:33 crc kubenswrapper[5033]: I0319 19:28:33.997957 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" podStartSLOduration=1.476480101 podStartE2EDuration="1.997935399s" podCreationTimestamp="2026-03-19 19:28:32 +0000 UTC" firstStartedPulling="2026-03-19 19:28:32.912163844 +0000 UTC m=+1923.017193693" lastFinishedPulling="2026-03-19 19:28:33.433619142 +0000 UTC m=+1923.538648991" observedRunningTime="2026-03-19 19:28:33.985520498 +0000 UTC m=+1924.090550377" 
watchObservedRunningTime="2026-03-19 19:28:33.997935399 +0000 UTC m=+1924.102965248" Mar 19 19:28:40 crc kubenswrapper[5033]: I0319 19:28:40.627355 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:28:40 crc kubenswrapper[5033]: E0319 19:28:40.628109 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:28:41 crc kubenswrapper[5033]: I0319 19:28:41.033315 5033 generic.go:334] "Generic (PLEG): container finished" podID="f24a4d98-fb0f-4bcd-905f-4bfce712a950" containerID="1719b1b42259e1d9b2980866df3a29d568ff27cee2976b3333fc48f711ac900b" exitCode=0 Mar 19 19:28:41 crc kubenswrapper[5033]: I0319 19:28:41.033363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" event={"ID":"f24a4d98-fb0f-4bcd-905f-4bfce712a950","Type":"ContainerDied","Data":"1719b1b42259e1d9b2980866df3a29d568ff27cee2976b3333fc48f711ac900b"} Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.575301 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.769350 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc4tf\" (UniqueName: \"kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf\") pod \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.769603 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0\") pod \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.769645 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam\") pod \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\" (UID: \"f24a4d98-fb0f-4bcd-905f-4bfce712a950\") " Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.775107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf" (OuterVolumeSpecName: "kube-api-access-sc4tf") pod "f24a4d98-fb0f-4bcd-905f-4bfce712a950" (UID: "f24a4d98-fb0f-4bcd-905f-4bfce712a950"). InnerVolumeSpecName "kube-api-access-sc4tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.804567 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f24a4d98-fb0f-4bcd-905f-4bfce712a950" (UID: "f24a4d98-fb0f-4bcd-905f-4bfce712a950"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.810407 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f24a4d98-fb0f-4bcd-905f-4bfce712a950" (UID: "f24a4d98-fb0f-4bcd-905f-4bfce712a950"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.872700 5033 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.872929 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f24a4d98-fb0f-4bcd-905f-4bfce712a950-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:42 crc kubenswrapper[5033]: I0319 19:28:42.872993 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc4tf\" (UniqueName: \"kubernetes.io/projected/f24a4d98-fb0f-4bcd-905f-4bfce712a950-kube-api-access-sc4tf\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.048621 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" 
event={"ID":"f24a4d98-fb0f-4bcd-905f-4bfce712a950","Type":"ContainerDied","Data":"bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3"} Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.048661 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf58a9a5ee8b98c41fb3773768930e4e8f38b6b51de4c48130ee10f8d3dd2ed3" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.048692 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4jwkh" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.137382 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml"] Mar 19 19:28:43 crc kubenswrapper[5033]: E0319 19:28:43.137902 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24a4d98-fb0f-4bcd-905f-4bfce712a950" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.137919 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24a4d98-fb0f-4bcd-905f-4bfce712a950" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.138173 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24a4d98-fb0f-4bcd-905f-4bfce712a950" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.139047 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.142120 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.142300 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.143015 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.144095 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.149329 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml"] Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.180215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44qc\" (UniqueName: \"kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.180285 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.180343 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.282748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.282924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44qc\" (UniqueName: \"kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.282968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.286559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: 
\"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.289626 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.298165 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44qc\" (UniqueName: \"kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f2rml\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:43 crc kubenswrapper[5033]: I0319 19:28:43.459531 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:44 crc kubenswrapper[5033]: I0319 19:28:44.033043 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml"] Mar 19 19:28:44 crc kubenswrapper[5033]: I0319 19:28:44.061854 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" event={"ID":"94576899-d90e-4b04-ae41-06c47f2a4383","Type":"ContainerStarted","Data":"420f0dc56b5870fa14e565034e8c3f460012cf6d0369ac88938222f21ac919b1"} Mar 19 19:28:45 crc kubenswrapper[5033]: I0319 19:28:45.072318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" event={"ID":"94576899-d90e-4b04-ae41-06c47f2a4383","Type":"ContainerStarted","Data":"9a6350794c7461ce0c51c1ccdd4341d5adc0de58e3a55429dd5b7494de1c18d0"} Mar 19 19:28:45 crc kubenswrapper[5033]: I0319 19:28:45.088819 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" podStartSLOduration=1.522091358 podStartE2EDuration="2.088803243s" podCreationTimestamp="2026-03-19 19:28:43 +0000 UTC" firstStartedPulling="2026-03-19 19:28:44.031970626 +0000 UTC m=+1934.137000505" lastFinishedPulling="2026-03-19 19:28:44.598682541 +0000 UTC m=+1934.703712390" observedRunningTime="2026-03-19 19:28:45.087372652 +0000 UTC m=+1935.192402501" watchObservedRunningTime="2026-03-19 19:28:45.088803243 +0000 UTC m=+1935.193833092" Mar 19 19:28:53 crc kubenswrapper[5033]: I0319 19:28:53.140808 5033 generic.go:334] "Generic (PLEG): container finished" podID="94576899-d90e-4b04-ae41-06c47f2a4383" containerID="9a6350794c7461ce0c51c1ccdd4341d5adc0de58e3a55429dd5b7494de1c18d0" exitCode=0 Mar 19 19:28:53 crc kubenswrapper[5033]: I0319 19:28:53.140910 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" event={"ID":"94576899-d90e-4b04-ae41-06c47f2a4383","Type":"ContainerDied","Data":"9a6350794c7461ce0c51c1ccdd4341d5adc0de58e3a55429dd5b7494de1c18d0"} Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.621634 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:28:54 crc kubenswrapper[5033]: E0319 19:28:54.622252 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.713921 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.824348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory\") pod \"94576899-d90e-4b04-ae41-06c47f2a4383\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.824481 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam\") pod \"94576899-d90e-4b04-ae41-06c47f2a4383\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.824565 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d44qc\" (UniqueName: \"kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc\") pod \"94576899-d90e-4b04-ae41-06c47f2a4383\" (UID: \"94576899-d90e-4b04-ae41-06c47f2a4383\") " Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.830769 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc" (OuterVolumeSpecName: "kube-api-access-d44qc") pod "94576899-d90e-4b04-ae41-06c47f2a4383" (UID: "94576899-d90e-4b04-ae41-06c47f2a4383"). InnerVolumeSpecName "kube-api-access-d44qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.852506 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "94576899-d90e-4b04-ae41-06c47f2a4383" (UID: "94576899-d90e-4b04-ae41-06c47f2a4383"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.852977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory" (OuterVolumeSpecName: "inventory") pod "94576899-d90e-4b04-ae41-06c47f2a4383" (UID: "94576899-d90e-4b04-ae41-06c47f2a4383"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.926643 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.926677 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94576899-d90e-4b04-ae41-06c47f2a4383-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:54 crc kubenswrapper[5033]: I0319 19:28:54.926690 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d44qc\" (UniqueName: \"kubernetes.io/projected/94576899-d90e-4b04-ae41-06c47f2a4383-kube-api-access-d44qc\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.161996 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" 
event={"ID":"94576899-d90e-4b04-ae41-06c47f2a4383","Type":"ContainerDied","Data":"420f0dc56b5870fa14e565034e8c3f460012cf6d0369ac88938222f21ac919b1"} Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.162271 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420f0dc56b5870fa14e565034e8c3f460012cf6d0369ac88938222f21ac919b1" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.162067 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f2rml" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.235131 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7"] Mar 19 19:28:55 crc kubenswrapper[5033]: E0319 19:28:55.236073 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94576899-d90e-4b04-ae41-06c47f2a4383" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.236093 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94576899-d90e-4b04-ae41-06c47f2a4383" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.236554 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="94576899-d90e-4b04-ae41-06c47f2a4383" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.237698 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.242871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.243378 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.243546 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.246366 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.280137 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7"] Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.344039 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtw87\" (UniqueName: \"kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.344115 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.344257 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.445561 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtw87\" (UniqueName: \"kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.445687 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.445768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.450966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.454610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.463398 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtw87\" (UniqueName: \"kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:55 crc kubenswrapper[5033]: I0319 19:28:55.561761 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:28:56 crc kubenswrapper[5033]: I0319 19:28:56.125137 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7"] Mar 19 19:28:56 crc kubenswrapper[5033]: I0319 19:28:56.177601 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" event={"ID":"5db4f1c0-87be-4f27-8a76-6f2a3deb4158","Type":"ContainerStarted","Data":"a48b48b99e85ee2d0f9cc1a2cde9c602ae4776a1bd3815e7477f8497d4322e8c"} Mar 19 19:28:57 crc kubenswrapper[5033]: I0319 19:28:57.189267 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" event={"ID":"5db4f1c0-87be-4f27-8a76-6f2a3deb4158","Type":"ContainerStarted","Data":"addc889a2b7df66989d4f58a51513343a98a72cbdcbf5388d3c5c6a057fd92e6"} Mar 19 19:28:57 crc kubenswrapper[5033]: I0319 19:28:57.204730 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" podStartSLOduration=1.751501344 podStartE2EDuration="2.204709018s" podCreationTimestamp="2026-03-19 19:28:55 +0000 UTC" firstStartedPulling="2026-03-19 19:28:56.118013596 +0000 UTC m=+1946.223043445" lastFinishedPulling="2026-03-19 19:28:56.57122127 +0000 UTC m=+1946.676251119" observedRunningTime="2026-03-19 19:28:57.201091067 +0000 UTC m=+1947.306120916" watchObservedRunningTime="2026-03-19 19:28:57.204709018 +0000 UTC m=+1947.309738877" Mar 19 19:29:03 crc kubenswrapper[5033]: I0319 19:29:03.051850 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jrt6w"] Mar 19 19:29:03 crc kubenswrapper[5033]: I0319 19:29:03.062957 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jrt6w"] Mar 19 19:29:04 crc kubenswrapper[5033]: I0319 19:29:04.631850 
5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451a5336-669f-41f5-aeb9-0c9db1ba5557" path="/var/lib/kubelet/pods/451a5336-669f-41f5-aeb9-0c9db1ba5557/volumes" Mar 19 19:29:06 crc kubenswrapper[5033]: I0319 19:29:06.277152 5033 generic.go:334] "Generic (PLEG): container finished" podID="5db4f1c0-87be-4f27-8a76-6f2a3deb4158" containerID="addc889a2b7df66989d4f58a51513343a98a72cbdcbf5388d3c5c6a057fd92e6" exitCode=0 Mar 19 19:29:06 crc kubenswrapper[5033]: I0319 19:29:06.277243 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" event={"ID":"5db4f1c0-87be-4f27-8a76-6f2a3deb4158","Type":"ContainerDied","Data":"addc889a2b7df66989d4f58a51513343a98a72cbdcbf5388d3c5c6a057fd92e6"} Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.620433 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:29:07 crc kubenswrapper[5033]: E0319 19:29:07.621011 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.733218 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.810205 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory\") pod \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.810610 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam\") pod \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.810658 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtw87\" (UniqueName: \"kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87\") pod \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\" (UID: \"5db4f1c0-87be-4f27-8a76-6f2a3deb4158\") " Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.815373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87" (OuterVolumeSpecName: "kube-api-access-dtw87") pod "5db4f1c0-87be-4f27-8a76-6f2a3deb4158" (UID: "5db4f1c0-87be-4f27-8a76-6f2a3deb4158"). InnerVolumeSpecName "kube-api-access-dtw87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.837346 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5db4f1c0-87be-4f27-8a76-6f2a3deb4158" (UID: "5db4f1c0-87be-4f27-8a76-6f2a3deb4158"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.838173 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory" (OuterVolumeSpecName: "inventory") pod "5db4f1c0-87be-4f27-8a76-6f2a3deb4158" (UID: "5db4f1c0-87be-4f27-8a76-6f2a3deb4158"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.913588 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.913626 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtw87\" (UniqueName: \"kubernetes.io/projected/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-kube-api-access-dtw87\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:07 crc kubenswrapper[5033]: I0319 19:29:07.913642 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5db4f1c0-87be-4f27-8a76-6f2a3deb4158-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.294186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" 
event={"ID":"5db4f1c0-87be-4f27-8a76-6f2a3deb4158","Type":"ContainerDied","Data":"a48b48b99e85ee2d0f9cc1a2cde9c602ae4776a1bd3815e7477f8497d4322e8c"} Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.294440 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48b48b99e85ee2d0f9cc1a2cde9c602ae4776a1bd3815e7477f8497d4322e8c" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.294249 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.449071 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn"] Mar 19 19:29:08 crc kubenswrapper[5033]: E0319 19:29:08.450932 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db4f1c0-87be-4f27-8a76-6f2a3deb4158" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.450975 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db4f1c0-87be-4f27-8a76-6f2a3deb4158" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.452093 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db4f1c0-87be-4f27-8a76-6f2a3deb4158" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.453380 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.455955 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.456194 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.456227 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.456372 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.465890 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.466297 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.466806 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.466989 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.480228 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn"] Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.627812 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.627867 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.627894 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.627913 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.627931 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628027 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628143 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4scq\" (UniqueName: 
\"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628186 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628235 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628250 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628305 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.628332 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.729905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.729950 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730262 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730317 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730342 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: 
\"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730387 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4scq\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.730439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.734555 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.734590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.734736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.735026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.735897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc 
kubenswrapper[5033]: I0319 19:29:08.736181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.736537 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.736558 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.736940 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.737216 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.737798 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.737945 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.750904 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.751759 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4scq\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:08 crc kubenswrapper[5033]: I0319 19:29:08.786105 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:09 crc kubenswrapper[5033]: I0319 19:29:09.327294 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn"] Mar 19 19:29:10 crc kubenswrapper[5033]: I0319 19:29:10.312088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" event={"ID":"bc8058af-3032-408e-b907-0eeed9d07109","Type":"ContainerStarted","Data":"d591eca714134c7ff9461d6d7d57291bfa24a71d0b67df45fe877bcfd397b19c"} Mar 19 19:29:10 crc kubenswrapper[5033]: I0319 19:29:10.312324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" event={"ID":"bc8058af-3032-408e-b907-0eeed9d07109","Type":"ContainerStarted","Data":"ebf1c9df50a150a602b9cd97ca7c687452de3fa5c2d8b28d9f66163d2e39471c"} Mar 19 19:29:10 crc kubenswrapper[5033]: I0319 19:29:10.340246 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" podStartSLOduration=1.9069349500000001 podStartE2EDuration="2.340223947s" podCreationTimestamp="2026-03-19 19:29:08 +0000 UTC" firstStartedPulling="2026-03-19 19:29:09.321639834 +0000 UTC m=+1959.426669683" lastFinishedPulling="2026-03-19 19:29:09.754928821 +0000 UTC m=+1959.859958680" observedRunningTime="2026-03-19 19:29:10.337102549 +0000 UTC m=+1960.442132408" watchObservedRunningTime="2026-03-19 19:29:10.340223947 +0000 UTC m=+1960.445253836" Mar 19 19:29:15 crc kubenswrapper[5033]: I0319 19:29:15.276291 5033 
scope.go:117] "RemoveContainer" containerID="cdbe1c534ce22b52c8228d973137f9955dc6f9bbf447ba7ad1411b7d356bfccf" Mar 19 19:29:15 crc kubenswrapper[5033]: I0319 19:29:15.324196 5033 scope.go:117] "RemoveContainer" containerID="422717fffafbccb1ed80dd1b95be440f70bd0c84c74665fdee32247cc6b72f45" Mar 19 19:29:15 crc kubenswrapper[5033]: I0319 19:29:15.386834 5033 scope.go:117] "RemoveContainer" containerID="332084d4eace736cc48044ca5babb2d7fced82ca3b44c054162404b3a02ce78e" Mar 19 19:29:18 crc kubenswrapper[5033]: I0319 19:29:18.621058 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:29:18 crc kubenswrapper[5033]: E0319 19:29:18.621656 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:29:30 crc kubenswrapper[5033]: I0319 19:29:30.626595 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:29:30 crc kubenswrapper[5033]: E0319 19:29:30.628731 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:29:44 crc kubenswrapper[5033]: I0319 19:29:44.621298 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 
19:29:44 crc kubenswrapper[5033]: E0319 19:29:44.622041 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:29:44 crc kubenswrapper[5033]: I0319 19:29:44.645871 5033 generic.go:334] "Generic (PLEG): container finished" podID="bc8058af-3032-408e-b907-0eeed9d07109" containerID="d591eca714134c7ff9461d6d7d57291bfa24a71d0b67df45fe877bcfd397b19c" exitCode=0 Mar 19 19:29:44 crc kubenswrapper[5033]: I0319 19:29:44.645913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" event={"ID":"bc8058af-3032-408e-b907-0eeed9d07109","Type":"ContainerDied","Data":"d591eca714134c7ff9461d6d7d57291bfa24a71d0b67df45fe877bcfd397b19c"} Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.228063 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.401711 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.401771 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.401994 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402156 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: 
\"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402232 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402401 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402562 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402666 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: 
\"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402866 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.402973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.403035 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4scq\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.403076 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle\") pod \"bc8058af-3032-408e-b907-0eeed9d07109\" (UID: \"bc8058af-3032-408e-b907-0eeed9d07109\") " Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.408733 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.408762 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.409096 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.409186 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.410499 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.411335 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.411402 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.412222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.414379 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq" (OuterVolumeSpecName: "kube-api-access-h4scq") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "kube-api-access-h4scq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.414921 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.417612 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.418735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.441968 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.444770 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory" (OuterVolumeSpecName: "inventory") pod "bc8058af-3032-408e-b907-0eeed9d07109" (UID: "bc8058af-3032-408e-b907-0eeed9d07109"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505678 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505934 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505949 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505966 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505978 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.505991 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506006 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506019 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506031 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4scq\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-kube-api-access-h4scq\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506043 5033 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 
19:29:46.506054 5033 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506065 5033 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506077 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bc8058af-3032-408e-b907-0eeed9d07109-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.506088 5033 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc8058af-3032-408e-b907-0eeed9d07109-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.666759 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" event={"ID":"bc8058af-3032-408e-b907-0eeed9d07109","Type":"ContainerDied","Data":"ebf1c9df50a150a602b9cd97ca7c687452de3fa5c2d8b28d9f66163d2e39471c"} Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.666801 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebf1c9df50a150a602b9cd97ca7c687452de3fa5c2d8b28d9f66163d2e39471c" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.666860 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.763090 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs"] Mar 19 19:29:46 crc kubenswrapper[5033]: E0319 19:29:46.763559 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc8058af-3032-408e-b907-0eeed9d07109" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.763584 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc8058af-3032-408e-b907-0eeed9d07109" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.763831 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc8058af-3032-408e-b907-0eeed9d07109" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.764920 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.771397 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.771625 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.771894 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.772140 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.772281 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.783402 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs"] Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.913668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.913786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: 
\"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.913841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.913992 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknb5\" (UniqueName: \"kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:46 crc kubenswrapper[5033]: I0319 19:29:46.914072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.016069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.016426 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.016605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.016655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknb5\" (UniqueName: \"kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.016683 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.017504 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.020038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.020618 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.027573 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.033982 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknb5\" (UniqueName: \"kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-495rs\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.119769 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.673821 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs"] Mar 19 19:29:47 crc kubenswrapper[5033]: I0319 19:29:47.683588 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:29:48 crc kubenswrapper[5033]: I0319 19:29:48.686667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" event={"ID":"158eff83-6646-4490-8169-9d2c9c9cd06c","Type":"ContainerStarted","Data":"458b1c660bce9523ecfd2dda3814fb88af82b1d92b17f2119a1e5d3d5dc0b463"} Mar 19 19:29:48 crc kubenswrapper[5033]: I0319 19:29:48.686947 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" event={"ID":"158eff83-6646-4490-8169-9d2c9c9cd06c","Type":"ContainerStarted","Data":"97576954f302f99dcc48002b921d7844028a3e2eda932e919230c8147aca90d2"} Mar 19 19:29:48 crc kubenswrapper[5033]: I0319 19:29:48.703895 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" podStartSLOduration=2.282819856 podStartE2EDuration="2.703873109s" podCreationTimestamp="2026-03-19 19:29:46 +0000 UTC" firstStartedPulling="2026-03-19 19:29:47.683334921 +0000 UTC m=+1997.788364760" lastFinishedPulling="2026-03-19 19:29:48.104388164 +0000 UTC m=+1998.209418013" observedRunningTime="2026-03-19 19:29:48.700082343 +0000 UTC m=+1998.805112222" watchObservedRunningTime="2026-03-19 19:29:48.703873109 +0000 UTC m=+1998.808902978" Mar 19 19:29:54 crc kubenswrapper[5033]: I0319 19:29:54.996488 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:29:54 crc kubenswrapper[5033]: I0319 19:29:54.999958 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.010871 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.084163 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.084557 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.084732 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxst\" (UniqueName: \"kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.186563 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.186689 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hgxst\" (UniqueName: \"kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.186847 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.187235 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.187327 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.210686 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxst\" (UniqueName: \"kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst\") pod \"redhat-operators-66wq2\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.321048 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:29:55 crc kubenswrapper[5033]: I0319 19:29:55.810073 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:29:56 crc kubenswrapper[5033]: I0319 19:29:56.766988 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerID="e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8" exitCode=0 Mar 19 19:29:56 crc kubenswrapper[5033]: I0319 19:29:56.767085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerDied","Data":"e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8"} Mar 19 19:29:56 crc kubenswrapper[5033]: I0319 19:29:56.767377 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerStarted","Data":"dbbb31d0c37d3c6883f2f994e4b49a111ae302697f22cb822b560b801834ca7d"} Mar 19 19:29:57 crc kubenswrapper[5033]: I0319 19:29:57.780013 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerStarted","Data":"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee"} Mar 19 19:29:59 crc kubenswrapper[5033]: I0319 19:29:59.622042 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:29:59 crc kubenswrapper[5033]: E0319 19:29:59.622614 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.140376 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565810-tgflk"] Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.141724 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.145572 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.147259 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.147275 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.157888 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb"] Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.161633 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.164297 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.164402 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.176059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-tgflk"] Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.187074 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb"] Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.308333 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wclwg\" (UniqueName: \"kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg\") pod \"auto-csr-approver-29565810-tgflk\" (UID: \"95869176-9b34-468d-9078-0bf13e82b316\") " pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.308376 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvm2r\" (UniqueName: \"kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.308433 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.308733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.410519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.410613 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclwg\" (UniqueName: \"kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg\") pod \"auto-csr-approver-29565810-tgflk\" (UID: \"95869176-9b34-468d-9078-0bf13e82b316\") " pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.410638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvm2r\" (UniqueName: \"kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: 
I0319 19:30:00.410685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.411517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.416614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.425781 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wclwg\" (UniqueName: \"kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg\") pod \"auto-csr-approver-29565810-tgflk\" (UID: \"95869176-9b34-468d-9078-0bf13e82b316\") " pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.439932 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvm2r\" (UniqueName: \"kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r\") pod \"collect-profiles-29565810-ndjfb\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.459039 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.475649 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:00 crc kubenswrapper[5033]: I0319 19:30:00.952400 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-tgflk"] Mar 19 19:30:00 crc kubenswrapper[5033]: W0319 19:30:00.956026 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95869176_9b34_468d_9078_0bf13e82b316.slice/crio-6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9 WatchSource:0}: Error finding container 6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9: Status 404 returned error can't find the container with id 6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9 Mar 19 19:30:01 crc kubenswrapper[5033]: I0319 19:30:01.074643 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb"] Mar 19 19:30:01 crc kubenswrapper[5033]: W0319 19:30:01.081800 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d8a751_7b1a_4169_bbb2_a052502ec1af.slice/crio-320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12 WatchSource:0}: Error finding container 320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12: Status 404 returned error can't find the container with id 320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12 Mar 19 19:30:01 crc kubenswrapper[5033]: I0319 19:30:01.820696 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-tgflk" event={"ID":"95869176-9b34-468d-9078-0bf13e82b316","Type":"ContainerStarted","Data":"6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9"} Mar 19 19:30:01 crc kubenswrapper[5033]: I0319 19:30:01.822952 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" event={"ID":"f9d8a751-7b1a-4169-bbb2-a052502ec1af","Type":"ContainerStarted","Data":"d627501252ac9a6fd3b9365d09f4444295a6e8b033282ebebf81550165f17836"} Mar 19 19:30:01 crc kubenswrapper[5033]: I0319 19:30:01.822989 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" event={"ID":"f9d8a751-7b1a-4169-bbb2-a052502ec1af","Type":"ContainerStarted","Data":"320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12"} Mar 19 19:30:01 crc kubenswrapper[5033]: I0319 19:30:01.837165 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" podStartSLOduration=1.837147065 podStartE2EDuration="1.837147065s" podCreationTimestamp="2026-03-19 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:30:01.835166849 +0000 UTC m=+2011.940196698" watchObservedRunningTime="2026-03-19 19:30:01.837147065 +0000 UTC m=+2011.942176914" Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.840432 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerID="1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee" exitCode=0 Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.840877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" 
event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerDied","Data":"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee"} Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.846661 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9d8a751-7b1a-4169-bbb2-a052502ec1af" containerID="d627501252ac9a6fd3b9365d09f4444295a6e8b033282ebebf81550165f17836" exitCode=0 Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.846721 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" event={"ID":"f9d8a751-7b1a-4169-bbb2-a052502ec1af","Type":"ContainerDied","Data":"d627501252ac9a6fd3b9365d09f4444295a6e8b033282ebebf81550165f17836"} Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.850408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-tgflk" event={"ID":"95869176-9b34-468d-9078-0bf13e82b316","Type":"ContainerStarted","Data":"517b4ccb2deb7f744bf4998d7aeecfc12e064191bf88431bcd33db8bebbb2fca"} Mar 19 19:30:02 crc kubenswrapper[5033]: I0319 19:30:02.908121 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565810-tgflk" podStartSLOduration=1.460186469 podStartE2EDuration="2.908099287s" podCreationTimestamp="2026-03-19 19:30:00 +0000 UTC" firstStartedPulling="2026-03-19 19:30:00.958619128 +0000 UTC m=+2011.063648977" lastFinishedPulling="2026-03-19 19:30:02.406531956 +0000 UTC m=+2012.511561795" observedRunningTime="2026-03-19 19:30:02.906025268 +0000 UTC m=+2013.011055127" watchObservedRunningTime="2026-03-19 19:30:02.908099287 +0000 UTC m=+2013.013129136" Mar 19 19:30:03 crc kubenswrapper[5033]: I0319 19:30:03.875749 5033 generic.go:334] "Generic (PLEG): container finished" podID="95869176-9b34-468d-9078-0bf13e82b316" containerID="517b4ccb2deb7f744bf4998d7aeecfc12e064191bf88431bcd33db8bebbb2fca" exitCode=0 Mar 19 19:30:03 crc 
kubenswrapper[5033]: I0319 19:30:03.875843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-tgflk" event={"ID":"95869176-9b34-468d-9078-0bf13e82b316","Type":"ContainerDied","Data":"517b4ccb2deb7f744bf4998d7aeecfc12e064191bf88431bcd33db8bebbb2fca"} Mar 19 19:30:03 crc kubenswrapper[5033]: I0319 19:30:03.879604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerStarted","Data":"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab"} Mar 19 19:30:03 crc kubenswrapper[5033]: I0319 19:30:03.921238 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-66wq2" podStartSLOduration=3.323493416 podStartE2EDuration="9.921214806s" podCreationTimestamp="2026-03-19 19:29:54 +0000 UTC" firstStartedPulling="2026-03-19 19:29:56.769508227 +0000 UTC m=+2006.874538086" lastFinishedPulling="2026-03-19 19:30:03.367229587 +0000 UTC m=+2013.472259476" observedRunningTime="2026-03-19 19:30:03.913573412 +0000 UTC m=+2014.018603271" watchObservedRunningTime="2026-03-19 19:30:03.921214806 +0000 UTC m=+2014.026244725" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.298977 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.398712 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume\") pod \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.399193 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume\") pod \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.400565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume" (OuterVolumeSpecName: "config-volume") pod "f9d8a751-7b1a-4169-bbb2-a052502ec1af" (UID: "f9d8a751-7b1a-4169-bbb2-a052502ec1af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.405625 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f9d8a751-7b1a-4169-bbb2-a052502ec1af" (UID: "f9d8a751-7b1a-4169-bbb2-a052502ec1af"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.502221 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvm2r\" (UniqueName: \"kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r\") pod \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\" (UID: \"f9d8a751-7b1a-4169-bbb2-a052502ec1af\") " Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.503033 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9d8a751-7b1a-4169-bbb2-a052502ec1af-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.503062 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9d8a751-7b1a-4169-bbb2-a052502ec1af-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.506313 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r" (OuterVolumeSpecName: "kube-api-access-zvm2r") pod "f9d8a751-7b1a-4169-bbb2-a052502ec1af" (UID: "f9d8a751-7b1a-4169-bbb2-a052502ec1af"). InnerVolumeSpecName "kube-api-access-zvm2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.607798 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvm2r\" (UniqueName: \"kubernetes.io/projected/f9d8a751-7b1a-4169-bbb2-a052502ec1af-kube-api-access-zvm2r\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.897645 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.897651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb" event={"ID":"f9d8a751-7b1a-4169-bbb2-a052502ec1af","Type":"ContainerDied","Data":"320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12"} Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.897733 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320f74918e4f03df7da9db9cd0b144dfd3690937813b448cf0a5599d42794c12" Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.933413 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd"] Mar 19 19:30:04 crc kubenswrapper[5033]: I0319 19:30:04.945143 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-jfjkd"] Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.232038 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.321218 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.321296 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.330716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wclwg\" (UniqueName: \"kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg\") pod \"95869176-9b34-468d-9078-0bf13e82b316\" (UID: \"95869176-9b34-468d-9078-0bf13e82b316\") " Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.334383 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg" (OuterVolumeSpecName: "kube-api-access-wclwg") pod "95869176-9b34-468d-9078-0bf13e82b316" (UID: "95869176-9b34-468d-9078-0bf13e82b316"). InnerVolumeSpecName "kube-api-access-wclwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.432475 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wclwg\" (UniqueName: \"kubernetes.io/projected/95869176-9b34-468d-9078-0bf13e82b316-kube-api-access-wclwg\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.912576 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-tgflk" event={"ID":"95869176-9b34-468d-9078-0bf13e82b316","Type":"ContainerDied","Data":"6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9"} Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.912619 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6667954ba49a3fe478bb0b96564ef7a5900cb1cbd8cf897996a362e61428bbd9" Mar 19 19:30:05 crc kubenswrapper[5033]: I0319 19:30:05.912631 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-tgflk" Mar 19 19:30:06 crc kubenswrapper[5033]: I0319 19:30:06.295915 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-47w5s"] Mar 19 19:30:06 crc kubenswrapper[5033]: I0319 19:30:06.308733 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-47w5s"] Mar 19 19:30:06 crc kubenswrapper[5033]: I0319 19:30:06.373228 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-66wq2" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="registry-server" probeResult="failure" output=< Mar 19 19:30:06 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:30:06 crc kubenswrapper[5033]: > Mar 19 19:30:06 crc kubenswrapper[5033]: I0319 19:30:06.633665 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="56560b0d-1894-4ef0-a563-e3eefed012b8" path="/var/lib/kubelet/pods/56560b0d-1894-4ef0-a563-e3eefed012b8/volumes" Mar 19 19:30:06 crc kubenswrapper[5033]: I0319 19:30:06.635935 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73127891-1d5d-4371-87c0-82245ab12d5d" path="/var/lib/kubelet/pods/73127891-1d5d-4371-87c0-82245ab12d5d/volumes" Mar 19 19:30:10 crc kubenswrapper[5033]: I0319 19:30:10.627405 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:30:10 crc kubenswrapper[5033]: E0319 19:30:10.628120 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:30:15 crc kubenswrapper[5033]: I0319 19:30:15.400848 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:15 crc kubenswrapper[5033]: I0319 19:30:15.461621 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:15 crc kubenswrapper[5033]: I0319 19:30:15.506247 5033 scope.go:117] "RemoveContainer" containerID="9435823276ba60fe0d014e94bdaa2429445483e766bcb33e12f00030a31c1f7a" Mar 19 19:30:15 crc kubenswrapper[5033]: I0319 19:30:15.584633 5033 scope.go:117] "RemoveContainer" containerID="d3c47cf3115467b6551c8ae8875310d6b60d9e4a0b4458cbfca0f09e4cadc039" Mar 19 19:30:15 crc kubenswrapper[5033]: I0319 19:30:15.645269 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 
19:30:17.013283 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-66wq2" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="registry-server" containerID="cri-o://17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab" gracePeriod=2 Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.543908 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.680532 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content\") pod \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.680598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxst\" (UniqueName: \"kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst\") pod \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.680831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities\") pod \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\" (UID: \"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f\") " Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.681651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities" (OuterVolumeSpecName: "utilities") pod "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" (UID: "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.707648 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst" (OuterVolumeSpecName: "kube-api-access-hgxst") pod "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" (UID: "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f"). InnerVolumeSpecName "kube-api-access-hgxst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.782838 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxst\" (UniqueName: \"kubernetes.io/projected/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-kube-api-access-hgxst\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.782871 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.857234 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" (UID: "f9dc3f00-6d8e-4cc6-8894-70c717f67d1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:30:17 crc kubenswrapper[5033]: I0319 19:30:17.884292 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.024134 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerID="17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab" exitCode=0 Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.024194 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66wq2" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.024215 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerDied","Data":"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab"} Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.025206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66wq2" event={"ID":"f9dc3f00-6d8e-4cc6-8894-70c717f67d1f","Type":"ContainerDied","Data":"dbbb31d0c37d3c6883f2f994e4b49a111ae302697f22cb822b560b801834ca7d"} Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.025236 5033 scope.go:117] "RemoveContainer" containerID="17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.050998 5033 scope.go:117] "RemoveContainer" containerID="1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.062508 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 
19:30:18.071137 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-66wq2"] Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.079247 5033 scope.go:117] "RemoveContainer" containerID="e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.122714 5033 scope.go:117] "RemoveContainer" containerID="17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab" Mar 19 19:30:18 crc kubenswrapper[5033]: E0319 19:30:18.123144 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab\": container with ID starting with 17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab not found: ID does not exist" containerID="17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.123183 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab"} err="failed to get container status \"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab\": rpc error: code = NotFound desc = could not find container \"17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab\": container with ID starting with 17e37ce174208d2030fad77ec134fe095617a5f0258ca65fd0451d09f9422eab not found: ID does not exist" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.123210 5033 scope.go:117] "RemoveContainer" containerID="1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee" Mar 19 19:30:18 crc kubenswrapper[5033]: E0319 19:30:18.123476 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee\": container with ID 
starting with 1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee not found: ID does not exist" containerID="1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.123509 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee"} err="failed to get container status \"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee\": rpc error: code = NotFound desc = could not find container \"1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee\": container with ID starting with 1de8c4af60dccf9832bd77f075ff50abf7b44408980440a2298bbd4d8fbabaee not found: ID does not exist" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.123530 5033 scope.go:117] "RemoveContainer" containerID="e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8" Mar 19 19:30:18 crc kubenswrapper[5033]: E0319 19:30:18.123754 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8\": container with ID starting with e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8 not found: ID does not exist" containerID="e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.123776 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8"} err="failed to get container status \"e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8\": rpc error: code = NotFound desc = could not find container \"e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8\": container with ID starting with e7401b521c19d743786e949c56065a8116f4269a3737f44a8944510cecc0fbe8 not found: 
ID does not exist" Mar 19 19:30:18 crc kubenswrapper[5033]: I0319 19:30:18.643832 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" path="/var/lib/kubelet/pods/f9dc3f00-6d8e-4cc6-8894-70c717f67d1f/volumes" Mar 19 19:30:21 crc kubenswrapper[5033]: I0319 19:30:21.620498 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:30:21 crc kubenswrapper[5033]: E0319 19:30:21.621355 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:30:33 crc kubenswrapper[5033]: I0319 19:30:33.063949 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-9lpbv"] Mar 19 19:30:33 crc kubenswrapper[5033]: I0319 19:30:33.074116 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-9lpbv"] Mar 19 19:30:34 crc kubenswrapper[5033]: I0319 19:30:34.654611 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb5bcaa-3619-4584-b125-1f3d521ffb2c" path="/var/lib/kubelet/pods/9fb5bcaa-3619-4584-b125-1f3d521ffb2c/volumes" Mar 19 19:30:35 crc kubenswrapper[5033]: I0319 19:30:35.620692 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:30:35 crc kubenswrapper[5033]: E0319 19:30:35.621030 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:30:39 crc kubenswrapper[5033]: I0319 19:30:39.031799 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-c4ql9"] Mar 19 19:30:39 crc kubenswrapper[5033]: I0319 19:30:39.040179 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-c4ql9"] Mar 19 19:30:40 crc kubenswrapper[5033]: I0319 19:30:40.638662 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f861aa-a778-4f3d-bc7f-e4b74b10ca6a" path="/var/lib/kubelet/pods/53f861aa-a778-4f3d-bc7f-e4b74b10ca6a/volumes" Mar 19 19:30:46 crc kubenswrapper[5033]: I0319 19:30:46.620530 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:30:47 crc kubenswrapper[5033]: I0319 19:30:47.531219 5033 generic.go:334] "Generic (PLEG): container finished" podID="158eff83-6646-4490-8169-9d2c9c9cd06c" containerID="458b1c660bce9523ecfd2dda3814fb88af82b1d92b17f2119a1e5d3d5dc0b463" exitCode=0 Mar 19 19:30:47 crc kubenswrapper[5033]: I0319 19:30:47.531427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" event={"ID":"158eff83-6646-4490-8169-9d2c9c9cd06c","Type":"ContainerDied","Data":"458b1c660bce9523ecfd2dda3814fb88af82b1d92b17f2119a1e5d3d5dc0b463"} Mar 19 19:30:47 crc kubenswrapper[5033]: I0319 19:30:47.536325 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3"} Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.016513 5033 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.208166 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory\") pod \"158eff83-6646-4490-8169-9d2c9c9cd06c\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.208304 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle\") pod \"158eff83-6646-4490-8169-9d2c9c9cd06c\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.208536 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0\") pod \"158eff83-6646-4490-8169-9d2c9c9cd06c\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.208624 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam\") pod \"158eff83-6646-4490-8169-9d2c9c9cd06c\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.208868 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cknb5\" (UniqueName: \"kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5\") pod \"158eff83-6646-4490-8169-9d2c9c9cd06c\" (UID: \"158eff83-6646-4490-8169-9d2c9c9cd06c\") " Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.220667 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "158eff83-6646-4490-8169-9d2c9c9cd06c" (UID: "158eff83-6646-4490-8169-9d2c9c9cd06c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.220758 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5" (OuterVolumeSpecName: "kube-api-access-cknb5") pod "158eff83-6646-4490-8169-9d2c9c9cd06c" (UID: "158eff83-6646-4490-8169-9d2c9c9cd06c"). InnerVolumeSpecName "kube-api-access-cknb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.235599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "158eff83-6646-4490-8169-9d2c9c9cd06c" (UID: "158eff83-6646-4490-8169-9d2c9c9cd06c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.236698 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory" (OuterVolumeSpecName: "inventory") pod "158eff83-6646-4490-8169-9d2c9c9cd06c" (UID: "158eff83-6646-4490-8169-9d2c9c9cd06c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.241099 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "158eff83-6646-4490-8169-9d2c9c9cd06c" (UID: "158eff83-6646-4490-8169-9d2c9c9cd06c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.311873 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.311907 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.311917 5033 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/158eff83-6646-4490-8169-9d2c9c9cd06c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.311929 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/158eff83-6646-4490-8169-9d2c9c9cd06c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.311937 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cknb5\" (UniqueName: \"kubernetes.io/projected/158eff83-6646-4490-8169-9d2c9c9cd06c-kube-api-access-cknb5\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.552593 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" event={"ID":"158eff83-6646-4490-8169-9d2c9c9cd06c","Type":"ContainerDied","Data":"97576954f302f99dcc48002b921d7844028a3e2eda932e919230c8147aca90d2"} Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.552654 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97576954f302f99dcc48002b921d7844028a3e2eda932e919230c8147aca90d2" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.552627 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-495rs" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649013 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5"] Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649402 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="registry-server" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649420 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="registry-server" Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649434 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158eff83-6646-4490-8169-9d2c9c9cd06c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649441 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="158eff83-6646-4490-8169-9d2c9c9cd06c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649483 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95869176-9b34-468d-9078-0bf13e82b316" containerName="oc" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649491 5033 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="95869176-9b34-468d-9078-0bf13e82b316" containerName="oc" Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="extract-utilities" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649506 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="extract-utilities" Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649517 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="extract-content" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649523 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="extract-content" Mar 19 19:30:49 crc kubenswrapper[5033]: E0319 19:30:49.649534 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d8a751-7b1a-4169-bbb2-a052502ec1af" containerName="collect-profiles" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d8a751-7b1a-4169-bbb2-a052502ec1af" containerName="collect-profiles" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649729 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="158eff83-6646-4490-8169-9d2c9c9cd06c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649744 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d8a751-7b1a-4169-bbb2-a052502ec1af" containerName="collect-profiles" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649758 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dc3f00-6d8e-4cc6-8894-70c717f67d1f" containerName="registry-server" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.649780 5033 
memory_manager.go:354] "RemoveStaleState removing state" podUID="95869176-9b34-468d-9078-0bf13e82b316" containerName="oc" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.650654 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.652554 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.654106 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.654570 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.654752 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.655300 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.655880 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.662334 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5"] Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.826937 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: 
\"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.827053 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.827091 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xll47\" (UniqueName: \"kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.827149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.827175 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" 
(UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.827298 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.928734 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.928790 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.928893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.929023 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.929067 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.929100 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xll47\" (UniqueName: \"kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.933821 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 
19:30:49.934390 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.938342 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.939347 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.941212 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:49 crc kubenswrapper[5033]: I0319 19:30:49.957534 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xll47\" 
(UniqueName: \"kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:50 crc kubenswrapper[5033]: I0319 19:30:50.020674 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:30:50 crc kubenswrapper[5033]: I0319 19:30:50.571578 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5"] Mar 19 19:30:50 crc kubenswrapper[5033]: W0319 19:30:50.578077 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b7506fc_a863_4730_be08_587061188731.slice/crio-44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df WatchSource:0}: Error finding container 44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df: Status 404 returned error can't find the container with id 44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df Mar 19 19:30:51 crc kubenswrapper[5033]: I0319 19:30:51.581158 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" event={"ID":"0b7506fc-a863-4730-be08-587061188731","Type":"ContainerStarted","Data":"8b41b8b1dd754693bd84f5d49d8f7fad2b6ed72534acc96444c4b0ae36d9a481"} Mar 19 19:30:51 crc kubenswrapper[5033]: I0319 19:30:51.581806 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" event={"ID":"0b7506fc-a863-4730-be08-587061188731","Type":"ContainerStarted","Data":"44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df"} Mar 19 19:30:51 crc kubenswrapper[5033]: I0319 19:30:51.609826 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" podStartSLOduration=2.1589132 podStartE2EDuration="2.609808909s" podCreationTimestamp="2026-03-19 19:30:49 +0000 UTC" firstStartedPulling="2026-03-19 19:30:50.581334658 +0000 UTC m=+2060.686364507" lastFinishedPulling="2026-03-19 19:30:51.032230367 +0000 UTC m=+2061.137260216" observedRunningTime="2026-03-19 19:30:51.601274909 +0000 UTC m=+2061.706304818" watchObservedRunningTime="2026-03-19 19:30:51.609808909 +0000 UTC m=+2061.714838758" Mar 19 19:31:15 crc kubenswrapper[5033]: I0319 19:31:15.676710 5033 scope.go:117] "RemoveContainer" containerID="1a790ba1233f7377994be247c27ae8f28531ed4c328257919d038a2548b48a1b" Mar 19 19:31:15 crc kubenswrapper[5033]: I0319 19:31:15.713161 5033 scope.go:117] "RemoveContainer" containerID="e0551935d2c9102ad1b47d80cf4386c012b279edd80bea174fcb2991f8ae8c9b" Mar 19 19:31:38 crc kubenswrapper[5033]: I0319 19:31:38.013827 5033 generic.go:334] "Generic (PLEG): container finished" podID="0b7506fc-a863-4730-be08-587061188731" containerID="8b41b8b1dd754693bd84f5d49d8f7fad2b6ed72534acc96444c4b0ae36d9a481" exitCode=0 Mar 19 19:31:38 crc kubenswrapper[5033]: I0319 19:31:38.014633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" event={"ID":"0b7506fc-a863-4730-be08-587061188731","Type":"ContainerDied","Data":"8b41b8b1dd754693bd84f5d49d8f7fad2b6ed72534acc96444c4b0ae36d9a481"} Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.548132 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.643722 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.644222 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.644349 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xll47\" (UniqueName: \"kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.644371 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.644491 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: 
\"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.644528 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory\") pod \"0b7506fc-a863-4730-be08-587061188731\" (UID: \"0b7506fc-a863-4730-be08-587061188731\") " Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.649763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47" (OuterVolumeSpecName: "kube-api-access-xll47") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "kube-api-access-xll47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.656807 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.674314 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory" (OuterVolumeSpecName: "inventory") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.675025 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.675108 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.711775 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0b7506fc-a863-4730-be08-587061188731" (UID: "0b7506fc-a863-4730-be08-587061188731"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746777 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746812 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746828 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746841 5033 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746852 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xll47\" (UniqueName: \"kubernetes.io/projected/0b7506fc-a863-4730-be08-587061188731-kube-api-access-xll47\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:39 crc kubenswrapper[5033]: I0319 19:31:39.746863 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7506fc-a863-4730-be08-587061188731-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.035413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" event={"ID":"0b7506fc-a863-4730-be08-587061188731","Type":"ContainerDied","Data":"44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df"} Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.035462 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.035482 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44388944b7b5e2beb826b9f6f2c0a63da604699134200449c4cd525b0cc709df" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.140734 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj"] Mar 19 19:31:40 crc kubenswrapper[5033]: E0319 19:31:40.141164 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7506fc-a863-4730-be08-587061188731" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.141180 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7506fc-a863-4730-be08-587061188731" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.141391 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7506fc-a863-4730-be08-587061188731" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.142163 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.145091 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.145494 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.145816 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.146008 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.151506 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.154018 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj"] Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.259476 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.259559 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.259679 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.259732 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.259905 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhv7\" (UniqueName: \"kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.361751 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.361866 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.361943 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.362033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhv7\" (UniqueName: \"kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.362136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.367463 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: 
\"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.367963 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.368218 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.368247 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.378496 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhv7\" (UniqueName: \"kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-685cj\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:40 crc kubenswrapper[5033]: I0319 19:31:40.465824 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:31:41 crc kubenswrapper[5033]: I0319 19:31:41.069140 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj"] Mar 19 19:31:42 crc kubenswrapper[5033]: I0319 19:31:42.060468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" event={"ID":"197f45b2-0d11-4b18-ac55-d4fb3b29c09e","Type":"ContainerStarted","Data":"4103da94908010992828954b8f1a00e46e0d6b7f55ebc1999f1d514f6171c3a8"} Mar 19 19:31:42 crc kubenswrapper[5033]: I0319 19:31:42.060818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" event={"ID":"197f45b2-0d11-4b18-ac55-d4fb3b29c09e","Type":"ContainerStarted","Data":"c3d08a8cbe367e29ed6a20c35632773ad0f1b8a7fc40073b89fa4d31eb8e8f16"} Mar 19 19:31:42 crc kubenswrapper[5033]: I0319 19:31:42.091484 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" podStartSLOduration=1.673147004 podStartE2EDuration="2.09146367s" podCreationTimestamp="2026-03-19 19:31:40 +0000 UTC" firstStartedPulling="2026-03-19 19:31:41.078770052 +0000 UTC m=+2111.183799901" lastFinishedPulling="2026-03-19 19:31:41.497086678 +0000 UTC m=+2111.602116567" observedRunningTime="2026-03-19 19:31:42.082606862 +0000 UTC m=+2112.187636711" watchObservedRunningTime="2026-03-19 19:31:42.09146367 +0000 UTC m=+2112.196493519" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.077041 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.081229 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.091686 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.234653 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.235148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.235317 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56jp\" (UniqueName: \"kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.337532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56jp\" (UniqueName: \"kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.337687 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.337736 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.338245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.338383 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.382301 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56jp\" (UniqueName: \"kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp\") pod \"community-operators-pcvf9\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.410323 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.470274 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.477426 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.486112 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.649365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.649427 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85l5n\" (UniqueName: \"kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.649656 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.751368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.751686 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85l5n\" (UniqueName: \"kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.751907 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.753188 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.753488 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.776203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85l5n\" (UniqueName: 
\"kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n\") pod \"certified-operators-c5l4v\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.852658 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:47 crc kubenswrapper[5033]: I0319 19:31:47.988483 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:31:48 crc kubenswrapper[5033]: I0319 19:31:48.141103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerStarted","Data":"b7a2a24da249175f1b85c7932d9218975d082a9fdea126774b543935d6540b39"} Mar 19 19:31:48 crc kubenswrapper[5033]: I0319 19:31:48.485186 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:31:49 crc kubenswrapper[5033]: I0319 19:31:49.152214 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerID="04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28" exitCode=0 Mar 19 19:31:49 crc kubenswrapper[5033]: I0319 19:31:49.152296 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerDied","Data":"04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28"} Mar 19 19:31:49 crc kubenswrapper[5033]: I0319 19:31:49.159109 5033 generic.go:334] "Generic (PLEG): container finished" podID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerID="ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1" exitCode=0 Mar 19 19:31:49 crc kubenswrapper[5033]: I0319 19:31:49.159152 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerDied","Data":"ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1"} Mar 19 19:31:49 crc kubenswrapper[5033]: I0319 19:31:49.159177 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerStarted","Data":"41ae5a34c3ef319ce37c9772d355a8beef927b0b071b16b8e6b3c3910b563400"} Mar 19 19:31:51 crc kubenswrapper[5033]: I0319 19:31:51.180034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerStarted","Data":"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f"} Mar 19 19:31:52 crc kubenswrapper[5033]: I0319 19:31:52.199471 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerID="1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f" exitCode=0 Mar 19 19:31:52 crc kubenswrapper[5033]: I0319 19:31:52.199811 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerDied","Data":"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f"} Mar 19 19:31:56 crc kubenswrapper[5033]: I0319 19:31:56.235489 5033 generic.go:334] "Generic (PLEG): container finished" podID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerID="ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba" exitCode=0 Mar 19 19:31:56 crc kubenswrapper[5033]: I0319 19:31:56.235716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" 
event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerDied","Data":"ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba"} Mar 19 19:31:56 crc kubenswrapper[5033]: I0319 19:31:56.240536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerStarted","Data":"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088"} Mar 19 19:31:56 crc kubenswrapper[5033]: I0319 19:31:56.288881 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pcvf9" podStartSLOduration=2.853494088 podStartE2EDuration="9.288860096s" podCreationTimestamp="2026-03-19 19:31:47 +0000 UTC" firstStartedPulling="2026-03-19 19:31:49.154940276 +0000 UTC m=+2119.259970115" lastFinishedPulling="2026-03-19 19:31:55.590306234 +0000 UTC m=+2125.695336123" observedRunningTime="2026-03-19 19:31:56.282675883 +0000 UTC m=+2126.387705722" watchObservedRunningTime="2026-03-19 19:31:56.288860096 +0000 UTC m=+2126.393889955" Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.250990 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerStarted","Data":"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b"} Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.279327 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c5l4v" podStartSLOduration=2.694108331 podStartE2EDuration="10.279302301s" podCreationTimestamp="2026-03-19 19:31:47 +0000 UTC" firstStartedPulling="2026-03-19 19:31:49.160844442 +0000 UTC m=+2119.265874291" lastFinishedPulling="2026-03-19 19:31:56.746038412 +0000 UTC m=+2126.851068261" observedRunningTime="2026-03-19 19:31:57.268177899 +0000 UTC m=+2127.373207748" 
watchObservedRunningTime="2026-03-19 19:31:57.279302301 +0000 UTC m=+2127.384332160" Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.410547 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.410882 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.853194 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:57 crc kubenswrapper[5033]: I0319 19:31:57.853423 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:31:58 crc kubenswrapper[5033]: I0319 19:31:58.469185 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pcvf9" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="registry-server" probeResult="failure" output=< Mar 19 19:31:58 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:31:58 crc kubenswrapper[5033]: > Mar 19 19:31:58 crc kubenswrapper[5033]: I0319 19:31:58.899076 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-c5l4v" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="registry-server" probeResult="failure" output=< Mar 19 19:31:58 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:31:58 crc kubenswrapper[5033]: > Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.132946 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565812-vpvdv"] Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.135020 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.138074 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.138316 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.138340 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.143927 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-vpvdv"] Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.213960 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nl5h\" (UniqueName: \"kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h\") pod \"auto-csr-approver-29565812-vpvdv\" (UID: \"319a2279-ecbd-46a0-901f-b8dd08654a38\") " pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.316259 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nl5h\" (UniqueName: \"kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h\") pod \"auto-csr-approver-29565812-vpvdv\" (UID: \"319a2279-ecbd-46a0-901f-b8dd08654a38\") " pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.333718 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nl5h\" (UniqueName: \"kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h\") pod \"auto-csr-approver-29565812-vpvdv\" (UID: \"319a2279-ecbd-46a0-901f-b8dd08654a38\") " 
pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.456140 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:00 crc kubenswrapper[5033]: I0319 19:32:00.966632 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-vpvdv"] Mar 19 19:32:01 crc kubenswrapper[5033]: I0319 19:32:01.289624 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" event={"ID":"319a2279-ecbd-46a0-901f-b8dd08654a38","Type":"ContainerStarted","Data":"17ef368e4e58a34e05336b667d5acd570f13db93b84330c403d32655a585a51d"} Mar 19 19:32:03 crc kubenswrapper[5033]: I0319 19:32:03.310814 5033 generic.go:334] "Generic (PLEG): container finished" podID="319a2279-ecbd-46a0-901f-b8dd08654a38" containerID="719fc800760b7d117b93f55d7c38d5282a2ac415435d4fd16c74a6da4aed323e" exitCode=0 Mar 19 19:32:03 crc kubenswrapper[5033]: I0319 19:32:03.310944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" event={"ID":"319a2279-ecbd-46a0-901f-b8dd08654a38","Type":"ContainerDied","Data":"719fc800760b7d117b93f55d7c38d5282a2ac415435d4fd16c74a6da4aed323e"} Mar 19 19:32:04 crc kubenswrapper[5033]: I0319 19:32:04.767046 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:04 crc kubenswrapper[5033]: I0319 19:32:04.939356 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nl5h\" (UniqueName: \"kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h\") pod \"319a2279-ecbd-46a0-901f-b8dd08654a38\" (UID: \"319a2279-ecbd-46a0-901f-b8dd08654a38\") " Mar 19 19:32:04 crc kubenswrapper[5033]: I0319 19:32:04.945427 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h" (OuterVolumeSpecName: "kube-api-access-8nl5h") pod "319a2279-ecbd-46a0-901f-b8dd08654a38" (UID: "319a2279-ecbd-46a0-901f-b8dd08654a38"). InnerVolumeSpecName "kube-api-access-8nl5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.043498 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nl5h\" (UniqueName: \"kubernetes.io/projected/319a2279-ecbd-46a0-901f-b8dd08654a38-kube-api-access-8nl5h\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.328853 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" event={"ID":"319a2279-ecbd-46a0-901f-b8dd08654a38","Type":"ContainerDied","Data":"17ef368e4e58a34e05336b667d5acd570f13db93b84330c403d32655a585a51d"} Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.328888 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ef368e4e58a34e05336b667d5acd570f13db93b84330c403d32655a585a51d" Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.329398 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-vpvdv" Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.829286 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-w24s6"] Mar 19 19:32:05 crc kubenswrapper[5033]: I0319 19:32:05.838081 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-w24s6"] Mar 19 19:32:06 crc kubenswrapper[5033]: I0319 19:32:06.631010 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2aaa5eb-4661-4a7f-9aa8-37dcde93526c" path="/var/lib/kubelet/pods/a2aaa5eb-4661-4a7f-9aa8-37dcde93526c/volumes" Mar 19 19:32:07 crc kubenswrapper[5033]: I0319 19:32:07.464150 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:32:07 crc kubenswrapper[5033]: I0319 19:32:07.528811 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:32:07 crc kubenswrapper[5033]: I0319 19:32:07.712257 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:32:07 crc kubenswrapper[5033]: I0319 19:32:07.920354 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:32:07 crc kubenswrapper[5033]: I0319 19:32:07.970497 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:32:09 crc kubenswrapper[5033]: I0319 19:32:09.371564 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pcvf9" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="registry-server" containerID="cri-o://beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088" gracePeriod=2 Mar 19 
19:32:09 crc kubenswrapper[5033]: I0319 19:32:09.938469 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.104353 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.126672 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.126944 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jd5t" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="registry-server" containerID="cri-o://bcfa8c2282c288d7eac3c600fb5994c8cff8bc1719c18bed883e208ef8e7f5a0" gracePeriod=2 Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.175905 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities\") pod \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.175968 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q56jp\" (UniqueName: \"kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp\") pod \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.176075 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content\") pod \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\" (UID: \"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8\") " Mar 19 19:32:10 crc 
kubenswrapper[5033]: I0319 19:32:10.176533 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities" (OuterVolumeSpecName: "utilities") pod "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" (UID: "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.189657 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp" (OuterVolumeSpecName: "kube-api-access-q56jp") pod "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" (UID: "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8"). InnerVolumeSpecName "kube-api-access-q56jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.267871 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" (UID: "f1210a57-39d8-48a7-a6b5-67ef7b1cbea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.278861 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.278900 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.278911 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56jp\" (UniqueName: \"kubernetes.io/projected/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8-kube-api-access-q56jp\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.387708 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerID="beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088" exitCode=0 Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.387754 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerDied","Data":"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088"} Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.387821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcvf9" event={"ID":"f1210a57-39d8-48a7-a6b5-67ef7b1cbea8","Type":"ContainerDied","Data":"b7a2a24da249175f1b85c7932d9218975d082a9fdea126774b543935d6540b39"} Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.387843 5033 scope.go:117] "RemoveContainer" containerID="beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 
19:32:10.387852 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pcvf9" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.399046 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerID="bcfa8c2282c288d7eac3c600fb5994c8cff8bc1719c18bed883e208ef8e7f5a0" exitCode=0 Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.399099 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerDied","Data":"bcfa8c2282c288d7eac3c600fb5994c8cff8bc1719c18bed883e208ef8e7f5a0"} Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.418548 5033 scope.go:117] "RemoveContainer" containerID="1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.437172 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.447580 5033 scope.go:117] "RemoveContainer" containerID="04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.455310 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pcvf9"] Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.516561 5033 scope.go:117] "RemoveContainer" containerID="beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088" Mar 19 19:32:10 crc kubenswrapper[5033]: E0319 19:32:10.525131 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088\": container with ID starting with beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088 not found: ID does not exist" 
containerID="beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.525176 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088"} err="failed to get container status \"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088\": rpc error: code = NotFound desc = could not find container \"beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088\": container with ID starting with beaccc8c28558c0add51ef4b67d702db03ad6dd0705115c86d828945965ae088 not found: ID does not exist" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.525199 5033 scope.go:117] "RemoveContainer" containerID="1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f" Mar 19 19:32:10 crc kubenswrapper[5033]: E0319 19:32:10.525851 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f\": container with ID starting with 1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f not found: ID does not exist" containerID="1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.525873 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f"} err="failed to get container status \"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f\": rpc error: code = NotFound desc = could not find container \"1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f\": container with ID starting with 1323c1efeb7303f1e26df04b78c2574b5df5ec064d47acac4304618faf6b139f not found: ID does not exist" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.525887 5033 scope.go:117] 
"RemoveContainer" containerID="04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28" Mar 19 19:32:10 crc kubenswrapper[5033]: E0319 19:32:10.526255 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28\": container with ID starting with 04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28 not found: ID does not exist" containerID="04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.526275 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28"} err="failed to get container status \"04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28\": rpc error: code = NotFound desc = could not find container \"04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28\": container with ID starting with 04c8fafc027804ef610551c4f31b1f71b93df3e70e562124f43cb551b6a4cb28 not found: ID does not exist" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.643810 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" path="/var/lib/kubelet/pods/f1210a57-39d8-48a7-a6b5-67ef7b1cbea8/volumes" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.705716 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.819366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ktj\" (UniqueName: \"kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj\") pod \"9ae96f2f-8565-4e27-a05b-c273e4859d74\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.819494 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities\") pod \"9ae96f2f-8565-4e27-a05b-c273e4859d74\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.819555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content\") pod \"9ae96f2f-8565-4e27-a05b-c273e4859d74\" (UID: \"9ae96f2f-8565-4e27-a05b-c273e4859d74\") " Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.821722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities" (OuterVolumeSpecName: "utilities") pod "9ae96f2f-8565-4e27-a05b-c273e4859d74" (UID: "9ae96f2f-8565-4e27-a05b-c273e4859d74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.828639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj" (OuterVolumeSpecName: "kube-api-access-r8ktj") pod "9ae96f2f-8565-4e27-a05b-c273e4859d74" (UID: "9ae96f2f-8565-4e27-a05b-c273e4859d74"). InnerVolumeSpecName "kube-api-access-r8ktj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.884734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ae96f2f-8565-4e27-a05b-c273e4859d74" (UID: "9ae96f2f-8565-4e27-a05b-c273e4859d74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.922042 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ktj\" (UniqueName: \"kubernetes.io/projected/9ae96f2f-8565-4e27-a05b-c273e4859d74-kube-api-access-r8ktj\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.922079 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:10 crc kubenswrapper[5033]: I0319 19:32:10.922089 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ae96f2f-8565-4e27-a05b-c273e4859d74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.413978 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jd5t" event={"ID":"9ae96f2f-8565-4e27-a05b-c273e4859d74","Type":"ContainerDied","Data":"c87cd7f40a3522acdaea750225bcad1570a4008c56fbf27051da0857e253bba8"} Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.414061 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jd5t" Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.414326 5033 scope.go:117] "RemoveContainer" containerID="bcfa8c2282c288d7eac3c600fb5994c8cff8bc1719c18bed883e208ef8e7f5a0" Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.447971 5033 scope.go:117] "RemoveContainer" containerID="6b9fe82c963ba5333b073be36284bc19244d2958a0f421a2001876d924418039" Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.454297 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.462855 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jd5t"] Mar 19 19:32:11 crc kubenswrapper[5033]: I0319 19:32:11.470751 5033 scope.go:117] "RemoveContainer" containerID="f2e71ca4a03e3b27ba111ccc31b123ad929f18a288ee1626b8ac193467f3f95d" Mar 19 19:32:12 crc kubenswrapper[5033]: I0319 19:32:12.631378 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" path="/var/lib/kubelet/pods/9ae96f2f-8565-4e27-a05b-c273e4859d74/volumes" Mar 19 19:32:15 crc kubenswrapper[5033]: I0319 19:32:15.817774 5033 scope.go:117] "RemoveContainer" containerID="58c2a5a536b777c21d76b253bfb70bea4e61315e54c211496b556d31f0b2fb39" Mar 19 19:33:10 crc kubenswrapper[5033]: I0319 19:33:10.758708 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:33:10 crc kubenswrapper[5033]: I0319 19:33:10.759270 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:33:40 crc kubenswrapper[5033]: I0319 19:33:40.760862 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:33:40 crc kubenswrapper[5033]: I0319 19:33:40.763627 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.160362 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565814-lwht5"] Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.161819 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.161849 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.161877 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.161894 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.161926 5033 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.161942 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.161965 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.161978 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.161998 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162040 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.162063 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162075 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: E0319 19:34:00.162126 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319a2279-ecbd-46a0-901f-b8dd08654a38" containerName="oc" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162142 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="319a2279-ecbd-46a0-901f-b8dd08654a38" containerName="oc" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162614 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9ae96f2f-8565-4e27-a05b-c273e4859d74" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162661 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1210a57-39d8-48a7-a6b5-67ef7b1cbea8" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.162695 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="319a2279-ecbd-46a0-901f-b8dd08654a38" containerName="oc" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.164165 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.176054 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-lwht5"] Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.180276 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.184344 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.184482 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.260985 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56n5\" (UniqueName: \"kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5\") pod \"auto-csr-approver-29565814-lwht5\" (UID: \"00b0e341-629b-43bd-a91d-b4e716e4aa5f\") " pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.362909 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56n5\" (UniqueName: 
\"kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5\") pod \"auto-csr-approver-29565814-lwht5\" (UID: \"00b0e341-629b-43bd-a91d-b4e716e4aa5f\") " pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.384048 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56n5\" (UniqueName: \"kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5\") pod \"auto-csr-approver-29565814-lwht5\" (UID: \"00b0e341-629b-43bd-a91d-b4e716e4aa5f\") " pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.500813 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:00 crc kubenswrapper[5033]: I0319 19:34:00.941660 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-lwht5"] Mar 19 19:34:01 crc kubenswrapper[5033]: I0319 19:34:01.704391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-lwht5" event={"ID":"00b0e341-629b-43bd-a91d-b4e716e4aa5f","Type":"ContainerStarted","Data":"723f864fec0acff1815ea93e14b85a222b0b89cde01b1c9766d2ac7e78fd0464"} Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.693627 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.696058 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.710339 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.713803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdn8\" (UniqueName: \"kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.713890 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.714030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.722022 5033 generic.go:334] "Generic (PLEG): container finished" podID="00b0e341-629b-43bd-a91d-b4e716e4aa5f" containerID="a28df8b72a3c925bce9a71678d2102496d59752b0bb7339a55fa71a7e4408a45" exitCode=0 Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.722067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-lwht5" 
event={"ID":"00b0e341-629b-43bd-a91d-b4e716e4aa5f","Type":"ContainerDied","Data":"a28df8b72a3c925bce9a71678d2102496d59752b0bb7339a55fa71a7e4408a45"} Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.816101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.816235 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdn8\" (UniqueName: \"kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.816286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.816672 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.817161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content\") pod \"redhat-marketplace-w89pj\" (UID: 
\"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:02 crc kubenswrapper[5033]: I0319 19:34:02.838343 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdn8\" (UniqueName: \"kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8\") pod \"redhat-marketplace-w89pj\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:03 crc kubenswrapper[5033]: I0319 19:34:03.067095 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:03 crc kubenswrapper[5033]: I0319 19:34:03.550758 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:03 crc kubenswrapper[5033]: I0319 19:34:03.733042 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerStarted","Data":"f37bf6f890233c8446a46b7f8400903334233039cf34cc6e4b88662955161b9c"} Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.154840 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.348222 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56n5\" (UniqueName: \"kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5\") pod \"00b0e341-629b-43bd-a91d-b4e716e4aa5f\" (UID: \"00b0e341-629b-43bd-a91d-b4e716e4aa5f\") " Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.354140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5" (OuterVolumeSpecName: "kube-api-access-t56n5") pod "00b0e341-629b-43bd-a91d-b4e716e4aa5f" (UID: "00b0e341-629b-43bd-a91d-b4e716e4aa5f"). InnerVolumeSpecName "kube-api-access-t56n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.450938 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t56n5\" (UniqueName: \"kubernetes.io/projected/00b0e341-629b-43bd-a91d-b4e716e4aa5f-kube-api-access-t56n5\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.743839 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-lwht5" Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.743845 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-lwht5" event={"ID":"00b0e341-629b-43bd-a91d-b4e716e4aa5f","Type":"ContainerDied","Data":"723f864fec0acff1815ea93e14b85a222b0b89cde01b1c9766d2ac7e78fd0464"} Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.743903 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723f864fec0acff1815ea93e14b85a222b0b89cde01b1c9766d2ac7e78fd0464" Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.745559 5033 generic.go:334] "Generic (PLEG): container finished" podID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerID="3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3" exitCode=0 Mar 19 19:34:04 crc kubenswrapper[5033]: I0319 19:34:04.745607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerDied","Data":"3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3"} Mar 19 19:34:05 crc kubenswrapper[5033]: I0319 19:34:05.249386 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-zlrn8"] Mar 19 19:34:05 crc kubenswrapper[5033]: I0319 19:34:05.258919 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-zlrn8"] Mar 19 19:34:05 crc kubenswrapper[5033]: I0319 19:34:05.756040 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerStarted","Data":"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c"} Mar 19 19:34:06 crc kubenswrapper[5033]: I0319 19:34:06.631905 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="048ce385-f25d-4fca-9508-7db919f7cb5f" path="/var/lib/kubelet/pods/048ce385-f25d-4fca-9508-7db919f7cb5f/volumes" Mar 19 19:34:06 crc kubenswrapper[5033]: I0319 19:34:06.767394 5033 generic.go:334] "Generic (PLEG): container finished" podID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerID="9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c" exitCode=0 Mar 19 19:34:06 crc kubenswrapper[5033]: I0319 19:34:06.767433 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerDied","Data":"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c"} Mar 19 19:34:07 crc kubenswrapper[5033]: I0319 19:34:07.777887 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerStarted","Data":"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893"} Mar 19 19:34:07 crc kubenswrapper[5033]: I0319 19:34:07.797273 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w89pj" podStartSLOduration=3.376468789 podStartE2EDuration="5.797257618s" podCreationTimestamp="2026-03-19 19:34:02 +0000 UTC" firstStartedPulling="2026-03-19 19:34:04.747149052 +0000 UTC m=+2254.852178891" lastFinishedPulling="2026-03-19 19:34:07.167937871 +0000 UTC m=+2257.272967720" observedRunningTime="2026-03-19 19:34:07.793546184 +0000 UTC m=+2257.898576023" watchObservedRunningTime="2026-03-19 19:34:07.797257618 +0000 UTC m=+2257.902287467" Mar 19 19:34:10 crc kubenswrapper[5033]: I0319 19:34:10.759070 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 19 19:34:10 crc kubenswrapper[5033]: I0319 19:34:10.760012 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:34:10 crc kubenswrapper[5033]: I0319 19:34:10.760061 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:34:10 crc kubenswrapper[5033]: I0319 19:34:10.760869 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:34:10 crc kubenswrapper[5033]: I0319 19:34:10.760917 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3" gracePeriod=600 Mar 19 19:34:11 crc kubenswrapper[5033]: I0319 19:34:11.823900 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3" exitCode=0 Mar 19 19:34:11 crc kubenswrapper[5033]: I0319 19:34:11.823986 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3"} Mar 19 19:34:11 crc kubenswrapper[5033]: I0319 19:34:11.824537 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0"} Mar 19 19:34:11 crc kubenswrapper[5033]: I0319 19:34:11.824563 5033 scope.go:117] "RemoveContainer" containerID="cb92bcbc0abda3471d55f855a080d6ee63ffab4c49578e14eef7bcb2bab827b6" Mar 19 19:34:13 crc kubenswrapper[5033]: I0319 19:34:13.068067 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:13 crc kubenswrapper[5033]: I0319 19:34:13.069480 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:13 crc kubenswrapper[5033]: I0319 19:34:13.117693 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:13 crc kubenswrapper[5033]: I0319 19:34:13.893966 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:13 crc kubenswrapper[5033]: I0319 19:34:13.944898 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:15 crc kubenswrapper[5033]: I0319 19:34:15.874233 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w89pj" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="registry-server" containerID="cri-o://8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893" gracePeriod=2 Mar 19 19:34:15 crc 
kubenswrapper[5033]: I0319 19:34:15.954772 5033 scope.go:117] "RemoveContainer" containerID="b6b2f3c156845ff9541d095075f5418ab1c766d5fa85178729e9cc43709f4524" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.486597 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.490095 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities\") pod \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.490422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdn8\" (UniqueName: \"kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8\") pod \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.490472 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content\") pod \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\" (UID: \"a086fc97-3c0e-49bb-b05f-68966f2adcc5\") " Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.491052 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities" (OuterVolumeSpecName: "utilities") pod "a086fc97-3c0e-49bb-b05f-68966f2adcc5" (UID: "a086fc97-3c0e-49bb-b05f-68966f2adcc5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.499150 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8" (OuterVolumeSpecName: "kube-api-access-6tdn8") pod "a086fc97-3c0e-49bb-b05f-68966f2adcc5" (UID: "a086fc97-3c0e-49bb-b05f-68966f2adcc5"). InnerVolumeSpecName "kube-api-access-6tdn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.517691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a086fc97-3c0e-49bb-b05f-68966f2adcc5" (UID: "a086fc97-3c0e-49bb-b05f-68966f2adcc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.592079 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdn8\" (UniqueName: \"kubernetes.io/projected/a086fc97-3c0e-49bb-b05f-68966f2adcc5-kube-api-access-6tdn8\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.592363 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.592372 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a086fc97-3c0e-49bb-b05f-68966f2adcc5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.889312 5033 generic.go:334] "Generic (PLEG): container finished" podID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" 
containerID="8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893" exitCode=0 Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.889372 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerDied","Data":"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893"} Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.889410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w89pj" event={"ID":"a086fc97-3c0e-49bb-b05f-68966f2adcc5","Type":"ContainerDied","Data":"f37bf6f890233c8446a46b7f8400903334233039cf34cc6e4b88662955161b9c"} Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.889436 5033 scope.go:117] "RemoveContainer" containerID="8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.889635 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w89pj" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.936293 5033 scope.go:117] "RemoveContainer" containerID="9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c" Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.955679 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.968714 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w89pj"] Mar 19 19:34:16 crc kubenswrapper[5033]: I0319 19:34:16.970200 5033 scope.go:117] "RemoveContainer" containerID="3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.037719 5033 scope.go:117] "RemoveContainer" containerID="8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893" Mar 19 19:34:17 crc kubenswrapper[5033]: E0319 19:34:17.038089 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893\": container with ID starting with 8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893 not found: ID does not exist" containerID="8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.038122 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893"} err="failed to get container status \"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893\": rpc error: code = NotFound desc = could not find container \"8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893\": container with ID starting with 8d7076fe265b2ec59d5937ef510d8d66d32f6199a9b1cda00e770f2a2f8d7893 not found: 
ID does not exist" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.038143 5033 scope.go:117] "RemoveContainer" containerID="9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c" Mar 19 19:34:17 crc kubenswrapper[5033]: E0319 19:34:17.038593 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c\": container with ID starting with 9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c not found: ID does not exist" containerID="9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.038636 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c"} err="failed to get container status \"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c\": rpc error: code = NotFound desc = could not find container \"9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c\": container with ID starting with 9b70f81322547245a58c7bc6e4989fce75c4687d1346ee931a6a207612ce010c not found: ID does not exist" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.038666 5033 scope.go:117] "RemoveContainer" containerID="3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3" Mar 19 19:34:17 crc kubenswrapper[5033]: E0319 19:34:17.038974 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3\": container with ID starting with 3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3 not found: ID does not exist" containerID="3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3" Mar 19 19:34:17 crc kubenswrapper[5033]: I0319 19:34:17.039001 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3"} err="failed to get container status \"3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3\": rpc error: code = NotFound desc = could not find container \"3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3\": container with ID starting with 3fcdb8f5550424bec1d008e981ca1a94848e1fad55b8b36886e157f59f47e3d3 not found: ID does not exist" Mar 19 19:34:18 crc kubenswrapper[5033]: I0319 19:34:18.634984 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" path="/var/lib/kubelet/pods/a086fc97-3c0e-49bb-b05f-68966f2adcc5/volumes" Mar 19 19:35:18 crc kubenswrapper[5033]: I0319 19:35:18.517924 5033 generic.go:334] "Generic (PLEG): container finished" podID="197f45b2-0d11-4b18-ac55-d4fb3b29c09e" containerID="4103da94908010992828954b8f1a00e46e0d6b7f55ebc1999f1d514f6171c3a8" exitCode=0 Mar 19 19:35:18 crc kubenswrapper[5033]: I0319 19:35:18.518003 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" event={"ID":"197f45b2-0d11-4b18-ac55-d4fb3b29c09e","Type":"ContainerDied","Data":"4103da94908010992828954b8f1a00e46e0d6b7f55ebc1999f1d514f6171c3a8"} Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.050844 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.180771 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle\") pod \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.180858 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0\") pod \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.180993 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam\") pod \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.181033 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory\") pod \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.181112 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmhv7\" (UniqueName: \"kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7\") pod \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\" (UID: \"197f45b2-0d11-4b18-ac55-d4fb3b29c09e\") " Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.187165 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "197f45b2-0d11-4b18-ac55-d4fb3b29c09e" (UID: "197f45b2-0d11-4b18-ac55-d4fb3b29c09e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.187183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7" (OuterVolumeSpecName: "kube-api-access-gmhv7") pod "197f45b2-0d11-4b18-ac55-d4fb3b29c09e" (UID: "197f45b2-0d11-4b18-ac55-d4fb3b29c09e"). InnerVolumeSpecName "kube-api-access-gmhv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.209470 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "197f45b2-0d11-4b18-ac55-d4fb3b29c09e" (UID: "197f45b2-0d11-4b18-ac55-d4fb3b29c09e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.214496 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "197f45b2-0d11-4b18-ac55-d4fb3b29c09e" (UID: "197f45b2-0d11-4b18-ac55-d4fb3b29c09e"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.221382 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory" (OuterVolumeSpecName: "inventory") pod "197f45b2-0d11-4b18-ac55-d4fb3b29c09e" (UID: "197f45b2-0d11-4b18-ac55-d4fb3b29c09e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.284741 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.285088 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.285107 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmhv7\" (UniqueName: \"kubernetes.io/projected/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-kube-api-access-gmhv7\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.285120 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.285141 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/197f45b2-0d11-4b18-ac55-d4fb3b29c09e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.541128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" event={"ID":"197f45b2-0d11-4b18-ac55-d4fb3b29c09e","Type":"ContainerDied","Data":"c3d08a8cbe367e29ed6a20c35632773ad0f1b8a7fc40073b89fa4d31eb8e8f16"} Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.541166 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d08a8cbe367e29ed6a20c35632773ad0f1b8a7fc40073b89fa4d31eb8e8f16" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.541172 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-685cj" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642510 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr"] Mar 19 19:35:20 crc kubenswrapper[5033]: E0319 19:35:20.642832 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197f45b2-0d11-4b18-ac55-d4fb3b29c09e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642860 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="197f45b2-0d11-4b18-ac55-d4fb3b29c09e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:20 crc kubenswrapper[5033]: E0319 19:35:20.642895 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="extract-utilities" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642902 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="extract-utilities" Mar 19 19:35:20 crc kubenswrapper[5033]: E0319 19:35:20.642914 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b0e341-629b-43bd-a91d-b4e716e4aa5f" containerName="oc" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642922 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00b0e341-629b-43bd-a91d-b4e716e4aa5f" containerName="oc" Mar 19 19:35:20 crc kubenswrapper[5033]: E0319 19:35:20.642942 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="extract-content" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642949 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="extract-content" Mar 19 19:35:20 crc kubenswrapper[5033]: E0319 19:35:20.642966 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="registry-server" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.642975 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="registry-server" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.643157 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a086fc97-3c0e-49bb-b05f-68966f2adcc5" containerName="registry-server" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.643167 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b0e341-629b-43bd-a91d-b4e716e4aa5f" containerName="oc" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.643179 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="197f45b2-0d11-4b18-ac55-d4fb3b29c09e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.643898 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.646299 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.646614 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.646629 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.646803 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.646907 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.648177 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.652414 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.655035 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr"] Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.796999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: 
I0319 19:35:20.797063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797109 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797210 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797239 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797272 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797314 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797336 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xj5\" (UniqueName: \"kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797383 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" 
(UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.797410 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898701 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898795 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898823 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898883 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xj5\" (UniqueName: \"kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.898965 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" 
(UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.899009 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.899051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.899083 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.900817 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.903188 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.904018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.904038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.904100 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.904176 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.904745 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.905025 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.906372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.907076 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.916550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xj5\" (UniqueName: 
\"kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-tplwr\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:20 crc kubenswrapper[5033]: I0319 19:35:20.969054 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:35:21 crc kubenswrapper[5033]: I0319 19:35:21.506188 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr"] Mar 19 19:35:21 crc kubenswrapper[5033]: I0319 19:35:21.508128 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:35:21 crc kubenswrapper[5033]: I0319 19:35:21.551442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" event={"ID":"44718e74-5ed8-41f2-880d-2b72e10d8cb4","Type":"ContainerStarted","Data":"65c90a8449d3d601c790294026cac1d28e3451cc2a4ea1c53709eacf5ea87727"} Mar 19 19:35:23 crc kubenswrapper[5033]: I0319 19:35:23.572567 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" event={"ID":"44718e74-5ed8-41f2-880d-2b72e10d8cb4","Type":"ContainerStarted","Data":"f862b78c443c38b2a69c6a67f1525831a717457065b12a7bf8795127ad7076a1"} Mar 19 19:35:23 crc kubenswrapper[5033]: I0319 19:35:23.596862 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" podStartSLOduration=2.843503031 podStartE2EDuration="3.596845073s" podCreationTimestamp="2026-03-19 19:35:20 +0000 UTC" firstStartedPulling="2026-03-19 19:35:21.507842516 +0000 UTC m=+2331.612872375" lastFinishedPulling="2026-03-19 19:35:22.261184568 +0000 UTC m=+2332.366214417" observedRunningTime="2026-03-19 19:35:23.58715753 
+0000 UTC m=+2333.692187399" watchObservedRunningTime="2026-03-19 19:35:23.596845073 +0000 UTC m=+2333.701874922" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.146024 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565816-tmd9q"] Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.148865 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.152265 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.152822 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.153037 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.156230 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-tmd9q"] Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.232166 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x6vt\" (UniqueName: \"kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt\") pod \"auto-csr-approver-29565816-tmd9q\" (UID: \"851856fc-535e-439c-a06e-df5962717a15\") " pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.333805 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x6vt\" (UniqueName: \"kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt\") pod \"auto-csr-approver-29565816-tmd9q\" (UID: \"851856fc-535e-439c-a06e-df5962717a15\") " 
pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.353582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x6vt\" (UniqueName: \"kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt\") pod \"auto-csr-approver-29565816-tmd9q\" (UID: \"851856fc-535e-439c-a06e-df5962717a15\") " pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.485127 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:00 crc kubenswrapper[5033]: I0319 19:36:00.979987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-tmd9q"] Mar 19 19:36:01 crc kubenswrapper[5033]: I0319 19:36:01.937346 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" event={"ID":"851856fc-535e-439c-a06e-df5962717a15","Type":"ContainerStarted","Data":"d0ad6100428568c0dd1d2c38475530edba4d4fdbda21e959578c8c191d7a10e7"} Mar 19 19:36:02 crc kubenswrapper[5033]: I0319 19:36:02.949916 5033 generic.go:334] "Generic (PLEG): container finished" podID="851856fc-535e-439c-a06e-df5962717a15" containerID="f8deb9d56461cf262278de5ac87943ae059e3b464aaa0127790a9913ef6a1165" exitCode=0 Mar 19 19:36:02 crc kubenswrapper[5033]: I0319 19:36:02.950183 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" event={"ID":"851856fc-535e-439c-a06e-df5962717a15","Type":"ContainerDied","Data":"f8deb9d56461cf262278de5ac87943ae059e3b464aaa0127790a9913ef6a1165"} Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.316828 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.417774 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x6vt\" (UniqueName: \"kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt\") pod \"851856fc-535e-439c-a06e-df5962717a15\" (UID: \"851856fc-535e-439c-a06e-df5962717a15\") " Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.422957 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt" (OuterVolumeSpecName: "kube-api-access-2x6vt") pod "851856fc-535e-439c-a06e-df5962717a15" (UID: "851856fc-535e-439c-a06e-df5962717a15"). InnerVolumeSpecName "kube-api-access-2x6vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.520503 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x6vt\" (UniqueName: \"kubernetes.io/projected/851856fc-535e-439c-a06e-df5962717a15-kube-api-access-2x6vt\") on node \"crc\" DevicePath \"\"" Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.971715 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" event={"ID":"851856fc-535e-439c-a06e-df5962717a15","Type":"ContainerDied","Data":"d0ad6100428568c0dd1d2c38475530edba4d4fdbda21e959578c8c191d7a10e7"} Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.971757 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ad6100428568c0dd1d2c38475530edba4d4fdbda21e959578c8c191d7a10e7" Mar 19 19:36:04 crc kubenswrapper[5033]: I0319 19:36:04.971831 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-tmd9q" Mar 19 19:36:05 crc kubenswrapper[5033]: I0319 19:36:05.395724 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-tgflk"] Mar 19 19:36:05 crc kubenswrapper[5033]: I0319 19:36:05.404007 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-tgflk"] Mar 19 19:36:06 crc kubenswrapper[5033]: I0319 19:36:06.632194 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95869176-9b34-468d-9078-0bf13e82b316" path="/var/lib/kubelet/pods/95869176-9b34-468d-9078-0bf13e82b316/volumes" Mar 19 19:36:16 crc kubenswrapper[5033]: I0319 19:36:16.173308 5033 scope.go:117] "RemoveContainer" containerID="517b4ccb2deb7f744bf4998d7aeecfc12e064191bf88431bcd33db8bebbb2fca" Mar 19 19:36:40 crc kubenswrapper[5033]: I0319 19:36:40.759263 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:36:40 crc kubenswrapper[5033]: I0319 19:36:40.759919 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:37:10 crc kubenswrapper[5033]: I0319 19:37:10.758510 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:37:10 crc kubenswrapper[5033]: 
I0319 19:37:10.758994 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:37:38 crc kubenswrapper[5033]: I0319 19:37:38.972681 5033 generic.go:334] "Generic (PLEG): container finished" podID="44718e74-5ed8-41f2-880d-2b72e10d8cb4" containerID="f862b78c443c38b2a69c6a67f1525831a717457065b12a7bf8795127ad7076a1" exitCode=0 Mar 19 19:37:38 crc kubenswrapper[5033]: I0319 19:37:38.972762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" event={"ID":"44718e74-5ed8-41f2-880d-2b72e10d8cb4","Type":"ContainerDied","Data":"f862b78c443c38b2a69c6a67f1525831a717457065b12a7bf8795127ad7076a1"} Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.475009 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571091 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571167 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571231 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571265 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571378 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571432 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571503 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571535 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5xj5\" (UniqueName: \"kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.571574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.572266 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1\") pod \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\" (UID: \"44718e74-5ed8-41f2-880d-2b72e10d8cb4\") " Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.577003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5" (OuterVolumeSpecName: "kube-api-access-s5xj5") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "kube-api-access-s5xj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.590151 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.602288 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.602730 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.604596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.611192 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.614028 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.616093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.625675 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.632726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.632738 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory" (OuterVolumeSpecName: "inventory") pod "44718e74-5ed8-41f2-880d-2b72e10d8cb4" (UID: "44718e74-5ed8-41f2-880d-2b72e10d8cb4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.674945 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.674985 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675000 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5xj5\" (UniqueName: \"kubernetes.io/projected/44718e74-5ed8-41f2-880d-2b72e10d8cb4-kube-api-access-s5xj5\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675011 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675022 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675034 5033 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675045 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675056 5033 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675067 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675079 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.675090 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44718e74-5ed8-41f2-880d-2b72e10d8cb4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.758429 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.758544 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 
19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.758594 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.759321 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.759369 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" gracePeriod=600 Mar 19 19:37:40 crc kubenswrapper[5033]: E0319 19:37:40.875809 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.990739 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" exitCode=0 Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.990828 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0"} Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.990900 5033 scope.go:117] "RemoveContainer" containerID="4ed30790da4dc475128c1ab38a4ada40a3de465c419f690dc7fa81af792206b3" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.991512 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:37:40 crc kubenswrapper[5033]: E0319 19:37:40.991764 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.992077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" event={"ID":"44718e74-5ed8-41f2-880d-2b72e10d8cb4","Type":"ContainerDied","Data":"65c90a8449d3d601c790294026cac1d28e3451cc2a4ea1c53709eacf5ea87727"} Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.992100 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65c90a8449d3d601c790294026cac1d28e3451cc2a4ea1c53709eacf5ea87727" Mar 19 19:37:40 crc kubenswrapper[5033]: I0319 19:37:40.992149 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-tplwr" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.124646 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f"] Mar 19 19:37:41 crc kubenswrapper[5033]: E0319 19:37:41.125206 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851856fc-535e-439c-a06e-df5962717a15" containerName="oc" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.125226 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="851856fc-535e-439c-a06e-df5962717a15" containerName="oc" Mar 19 19:37:41 crc kubenswrapper[5033]: E0319 19:37:41.125243 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44718e74-5ed8-41f2-880d-2b72e10d8cb4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.125251 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="44718e74-5ed8-41f2-880d-2b72e10d8cb4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.125502 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="851856fc-535e-439c-a06e-df5962717a15" containerName="oc" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.125531 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="44718e74-5ed8-41f2-880d-2b72e10d8cb4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.126445 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.129265 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.129487 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.130338 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.130536 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.130924 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kzctb" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.135071 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f"] Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185113 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64pb7\" (UniqueName: \"kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185272 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185293 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.185416 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.287807 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.288372 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.288439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: 
I0319 19:37:41.288488 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64pb7\" (UniqueName: \"kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.288541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.288633 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.288675 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.292129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.298189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.312763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.314152 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64pb7\" (UniqueName: \"kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.314654 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: 
\"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.318053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.319051 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s582f\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:41 crc kubenswrapper[5033]: I0319 19:37:41.449980 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:37:42 crc kubenswrapper[5033]: I0319 19:37:42.003612 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f"] Mar 19 19:37:43 crc kubenswrapper[5033]: I0319 19:37:43.018321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" event={"ID":"57028763-d499-4e66-aaa7-52bbf97174d3","Type":"ContainerStarted","Data":"cd7c48092ca2e6e1193a2ab3772c9994044aee2fa09bc0c20e131a32012d1ab0"} Mar 19 19:37:43 crc kubenswrapper[5033]: I0319 19:37:43.018758 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" event={"ID":"57028763-d499-4e66-aaa7-52bbf97174d3","Type":"ContainerStarted","Data":"f3da0d7676ff963d7c623bf3e59d4cde18be757a6f5824c8c2cf8dd71d8dea3e"} Mar 19 19:37:43 crc kubenswrapper[5033]: I0319 19:37:43.049170 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" podStartSLOduration=1.654159446 podStartE2EDuration="2.049147115s" podCreationTimestamp="2026-03-19 19:37:41 +0000 UTC" firstStartedPulling="2026-03-19 19:37:42.010541709 +0000 UTC m=+2472.115571558" lastFinishedPulling="2026-03-19 19:37:42.405529378 +0000 UTC m=+2472.510559227" observedRunningTime="2026-03-19 19:37:43.039643335 +0000 UTC m=+2473.144673184" watchObservedRunningTime="2026-03-19 19:37:43.049147115 +0000 UTC m=+2473.154176974" Mar 19 19:37:53 crc kubenswrapper[5033]: I0319 19:37:53.621292 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:37:53 crc kubenswrapper[5033]: E0319 19:37:53.622011 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.141073 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565818-vjswh"] Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.143356 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.146795 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.146814 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.147230 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.161906 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-vjswh"] Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.215323 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvx2\" (UniqueName: \"kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2\") pod \"auto-csr-approver-29565818-vjswh\" (UID: \"8bb883c9-bc03-46e8-9578-6da2f13278ab\") " pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.317532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvx2\" (UniqueName: 
\"kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2\") pod \"auto-csr-approver-29565818-vjswh\" (UID: \"8bb883c9-bc03-46e8-9578-6da2f13278ab\") " pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.355567 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvx2\" (UniqueName: \"kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2\") pod \"auto-csr-approver-29565818-vjswh\" (UID: \"8bb883c9-bc03-46e8-9578-6da2f13278ab\") " pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.468726 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:00 crc kubenswrapper[5033]: I0319 19:38:00.924329 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-vjswh"] Mar 19 19:38:00 crc kubenswrapper[5033]: W0319 19:38:00.926348 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb883c9_bc03_46e8_9578_6da2f13278ab.slice/crio-55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88 WatchSource:0}: Error finding container 55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88: Status 404 returned error can't find the container with id 55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88 Mar 19 19:38:01 crc kubenswrapper[5033]: I0319 19:38:01.208180 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-vjswh" event={"ID":"8bb883c9-bc03-46e8-9578-6da2f13278ab","Type":"ContainerStarted","Data":"55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88"} Mar 19 19:38:03 crc kubenswrapper[5033]: I0319 19:38:03.228968 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="8bb883c9-bc03-46e8-9578-6da2f13278ab" containerID="f0db6a69b63b8e7aa881c570f9021f09d11294c8880fd4fae08517b222fb12ca" exitCode=0 Mar 19 19:38:03 crc kubenswrapper[5033]: I0319 19:38:03.229046 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-vjswh" event={"ID":"8bb883c9-bc03-46e8-9578-6da2f13278ab","Type":"ContainerDied","Data":"f0db6a69b63b8e7aa881c570f9021f09d11294c8880fd4fae08517b222fb12ca"} Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:04.621363 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:38:05 crc kubenswrapper[5033]: E0319 19:38:04.621932 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:04.673578 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:04.818367 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvx2\" (UniqueName: \"kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2\") pod \"8bb883c9-bc03-46e8-9578-6da2f13278ab\" (UID: \"8bb883c9-bc03-46e8-9578-6da2f13278ab\") " Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:04.828293 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2" (OuterVolumeSpecName: "kube-api-access-2tvx2") pod "8bb883c9-bc03-46e8-9578-6da2f13278ab" (UID: "8bb883c9-bc03-46e8-9578-6da2f13278ab"). InnerVolumeSpecName "kube-api-access-2tvx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:04.925506 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvx2\" (UniqueName: \"kubernetes.io/projected/8bb883c9-bc03-46e8-9578-6da2f13278ab-kube-api-access-2tvx2\") on node \"crc\" DevicePath \"\"" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:05.257240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-vjswh" event={"ID":"8bb883c9-bc03-46e8-9578-6da2f13278ab","Type":"ContainerDied","Data":"55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88"} Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:05.257313 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55bdb91549b63a0b2048d04bc9805e41ecc00dc9d6a6a964d1bff24bd6a54a88" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:05.257335 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-vjswh" Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:05.754010 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-vpvdv"] Mar 19 19:38:05 crc kubenswrapper[5033]: I0319 19:38:05.762622 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-vpvdv"] Mar 19 19:38:06 crc kubenswrapper[5033]: I0319 19:38:06.636073 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319a2279-ecbd-46a0-901f-b8dd08654a38" path="/var/lib/kubelet/pods/319a2279-ecbd-46a0-901f-b8dd08654a38/volumes" Mar 19 19:38:15 crc kubenswrapper[5033]: I0319 19:38:15.620324 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:38:15 crc kubenswrapper[5033]: E0319 19:38:15.621104 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:38:16 crc kubenswrapper[5033]: I0319 19:38:16.283253 5033 scope.go:117] "RemoveContainer" containerID="719fc800760b7d117b93f55d7c38d5282a2ac415435d4fd16c74a6da4aed323e" Mar 19 19:38:29 crc kubenswrapper[5033]: I0319 19:38:29.620285 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:38:29 crc kubenswrapper[5033]: E0319 19:38:29.621938 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:38:44 crc kubenswrapper[5033]: I0319 19:38:44.643243 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:38:44 crc kubenswrapper[5033]: E0319 19:38:44.645896 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:38:55 crc kubenswrapper[5033]: I0319 19:38:55.620380 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:38:55 crc kubenswrapper[5033]: E0319 19:38:55.621171 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:39:06 crc kubenswrapper[5033]: I0319 19:39:06.621204 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:39:06 crc kubenswrapper[5033]: E0319 19:39:06.623241 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:39:17 crc kubenswrapper[5033]: I0319 19:39:17.620397 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:39:17 crc kubenswrapper[5033]: E0319 19:39:17.621271 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:39:30 crc kubenswrapper[5033]: I0319 19:39:30.627522 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:39:30 crc kubenswrapper[5033]: E0319 19:39:30.628471 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:39:45 crc kubenswrapper[5033]: I0319 19:39:45.621078 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:39:45 crc kubenswrapper[5033]: E0319 19:39:45.621993 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:39:59 crc kubenswrapper[5033]: I0319 19:39:59.652078 5033 generic.go:334] "Generic (PLEG): container finished" podID="57028763-d499-4e66-aaa7-52bbf97174d3" containerID="cd7c48092ca2e6e1193a2ab3772c9994044aee2fa09bc0c20e131a32012d1ab0" exitCode=0 Mar 19 19:39:59 crc kubenswrapper[5033]: I0319 19:39:59.652154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" event={"ID":"57028763-d499-4e66-aaa7-52bbf97174d3","Type":"ContainerDied","Data":"cd7c48092ca2e6e1193a2ab3772c9994044aee2fa09bc0c20e131a32012d1ab0"} Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.148870 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565820-jj25g"] Mar 19 19:40:00 crc kubenswrapper[5033]: E0319 19:40:00.149411 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb883c9-bc03-46e8-9578-6da2f13278ab" containerName="oc" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.149434 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb883c9-bc03-46e8-9578-6da2f13278ab" containerName="oc" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.149745 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb883c9-bc03-46e8-9578-6da2f13278ab" containerName="oc" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.150578 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.152633 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.154043 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.154757 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.165542 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-jj25g"] Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.237627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4nl\" (UniqueName: \"kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl\") pod \"auto-csr-approver-29565820-jj25g\" (UID: \"901cdd28-c641-475f-a363-fa792b6e0fde\") " pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.339213 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4nl\" (UniqueName: \"kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl\") pod \"auto-csr-approver-29565820-jj25g\" (UID: \"901cdd28-c641-475f-a363-fa792b6e0fde\") " pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.357371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4nl\" (UniqueName: \"kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl\") pod \"auto-csr-approver-29565820-jj25g\" (UID: \"901cdd28-c641-475f-a363-fa792b6e0fde\") " 
pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.467644 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.656144 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:40:00 crc kubenswrapper[5033]: E0319 19:40:00.657615 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:40:00 crc kubenswrapper[5033]: I0319 19:40:00.924173 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-jj25g"] Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.101662 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.158648 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.158772 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64pb7\" (UniqueName: \"kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.158833 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.159210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.159335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: 
I0319 19:40:01.159373 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.159401 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0\") pod \"57028763-d499-4e66-aaa7-52bbf97174d3\" (UID: \"57028763-d499-4e66-aaa7-52bbf97174d3\") " Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.164784 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.165156 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7" (OuterVolumeSpecName: "kube-api-access-64pb7") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "kube-api-access-64pb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.188726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.190101 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.192691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.193072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory" (OuterVolumeSpecName: "inventory") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.198097 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "57028763-d499-4e66-aaa7-52bbf97174d3" (UID: "57028763-d499-4e66-aaa7-52bbf97174d3"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262600 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64pb7\" (UniqueName: \"kubernetes.io/projected/57028763-d499-4e66-aaa7-52bbf97174d3-kube-api-access-64pb7\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262682 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262699 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262712 5033 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262724 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc 
kubenswrapper[5033]: I0319 19:40:01.262735 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.262746 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/57028763-d499-4e66-aaa7-52bbf97174d3-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.670116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-jj25g" event={"ID":"901cdd28-c641-475f-a363-fa792b6e0fde","Type":"ContainerStarted","Data":"677c89fdccff47b4f1ebbbde4a3c6b0cc2875d14ae04c55bfbd8a5839a433df0"} Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.671888 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" event={"ID":"57028763-d499-4e66-aaa7-52bbf97174d3","Type":"ContainerDied","Data":"f3da0d7676ff963d7c623bf3e59d4cde18be757a6f5824c8c2cf8dd71d8dea3e"} Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.671914 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3da0d7676ff963d7c623bf3e59d4cde18be757a6f5824c8c2cf8dd71d8dea3e" Mar 19 19:40:01 crc kubenswrapper[5033]: I0319 19:40:01.671967 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s582f" Mar 19 19:40:02 crc kubenswrapper[5033]: I0319 19:40:02.681538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-jj25g" event={"ID":"901cdd28-c641-475f-a363-fa792b6e0fde","Type":"ContainerStarted","Data":"936d75a8fafc6cb51bedd99d277eed128fba09664bb7f1531977ba952abb5efa"} Mar 19 19:40:02 crc kubenswrapper[5033]: I0319 19:40:02.702155 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565820-jj25g" podStartSLOduration=1.3761936989999999 podStartE2EDuration="2.702131873s" podCreationTimestamp="2026-03-19 19:40:00 +0000 UTC" firstStartedPulling="2026-03-19 19:40:00.935032007 +0000 UTC m=+2611.040061856" lastFinishedPulling="2026-03-19 19:40:02.260970181 +0000 UTC m=+2612.366000030" observedRunningTime="2026-03-19 19:40:02.697682456 +0000 UTC m=+2612.802712305" watchObservedRunningTime="2026-03-19 19:40:02.702131873 +0000 UTC m=+2612.807161712" Mar 19 19:40:03 crc kubenswrapper[5033]: I0319 19:40:03.691113 5033 generic.go:334] "Generic (PLEG): container finished" podID="901cdd28-c641-475f-a363-fa792b6e0fde" containerID="936d75a8fafc6cb51bedd99d277eed128fba09664bb7f1531977ba952abb5efa" exitCode=0 Mar 19 19:40:03 crc kubenswrapper[5033]: I0319 19:40:03.691217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-jj25g" event={"ID":"901cdd28-c641-475f-a363-fa792b6e0fde","Type":"ContainerDied","Data":"936d75a8fafc6cb51bedd99d277eed128fba09664bb7f1531977ba952abb5efa"} Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.127439 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.238796 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4nl\" (UniqueName: \"kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl\") pod \"901cdd28-c641-475f-a363-fa792b6e0fde\" (UID: \"901cdd28-c641-475f-a363-fa792b6e0fde\") " Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.245869 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl" (OuterVolumeSpecName: "kube-api-access-cr4nl") pod "901cdd28-c641-475f-a363-fa792b6e0fde" (UID: "901cdd28-c641-475f-a363-fa792b6e0fde"). InnerVolumeSpecName "kube-api-access-cr4nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.340706 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4nl\" (UniqueName: \"kubernetes.io/projected/901cdd28-c641-475f-a363-fa792b6e0fde-kube-api-access-cr4nl\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.716606 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-jj25g" event={"ID":"901cdd28-c641-475f-a363-fa792b6e0fde","Type":"ContainerDied","Data":"677c89fdccff47b4f1ebbbde4a3c6b0cc2875d14ae04c55bfbd8a5839a433df0"} Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.716900 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="677c89fdccff47b4f1ebbbde4a3c6b0cc2875d14ae04c55bfbd8a5839a433df0" Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.716969 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-jj25g" Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.766931 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-lwht5"] Mar 19 19:40:05 crc kubenswrapper[5033]: I0319 19:40:05.776003 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-lwht5"] Mar 19 19:40:06 crc kubenswrapper[5033]: I0319 19:40:06.635079 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b0e341-629b-43bd-a91d-b4e716e4aa5f" path="/var/lib/kubelet/pods/00b0e341-629b-43bd-a91d-b4e716e4aa5f/volumes" Mar 19 19:40:11 crc kubenswrapper[5033]: I0319 19:40:11.620563 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:40:11 crc kubenswrapper[5033]: E0319 19:40:11.623006 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:40:16 crc kubenswrapper[5033]: I0319 19:40:16.366435 5033 scope.go:117] "RemoveContainer" containerID="a28df8b72a3c925bce9a71678d2102496d59752b0bb7339a55fa71a7e4408a45" Mar 19 19:40:26 crc kubenswrapper[5033]: I0319 19:40:26.620363 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:40:26 crc kubenswrapper[5033]: E0319 19:40:26.621205 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:40:38 crc kubenswrapper[5033]: I0319 19:40:38.621344 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:40:38 crc kubenswrapper[5033]: E0319 19:40:38.622133 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:40:53 crc kubenswrapper[5033]: I0319 19:40:53.621163 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:40:53 crc kubenswrapper[5033]: E0319 19:40:53.622323 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:41:06 crc kubenswrapper[5033]: I0319 19:41:06.620331 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:41:06 crc kubenswrapper[5033]: E0319 19:41:06.621276 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.373711 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:10 crc kubenswrapper[5033]: E0319 19:41:10.375377 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57028763-d499-4e66-aaa7-52bbf97174d3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.377741 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="57028763-d499-4e66-aaa7-52bbf97174d3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:41:10 crc kubenswrapper[5033]: E0319 19:41:10.377973 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901cdd28-c641-475f-a363-fa792b6e0fde" containerName="oc" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.378004 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="901cdd28-c641-475f-a363-fa792b6e0fde" containerName="oc" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.379405 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="901cdd28-c641-475f-a363-fa792b6e0fde" containerName="oc" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.379610 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="57028763-d499-4e66-aaa7-52bbf97174d3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.383818 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.392989 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.477249 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4gxf\" (UniqueName: \"kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.477389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.477426 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.579142 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.579207 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.579335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4gxf\" (UniqueName: \"kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.579712 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.579834 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.601847 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4gxf\" (UniqueName: \"kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf\") pod \"redhat-operators-84vx6\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:10 crc kubenswrapper[5033]: I0319 19:41:10.721123 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:11 crc kubenswrapper[5033]: I0319 19:41:11.242809 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:11 crc kubenswrapper[5033]: I0319 19:41:11.355692 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerStarted","Data":"5790214f76bf12cac36413c768e9468003fedc9b02b55febfb417bf64972edff"} Mar 19 19:41:12 crc kubenswrapper[5033]: I0319 19:41:12.366324 5033 generic.go:334] "Generic (PLEG): container finished" podID="d51b7922-7d81-4642-984c-149c6aad8878" containerID="776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23" exitCode=0 Mar 19 19:41:12 crc kubenswrapper[5033]: I0319 19:41:12.366402 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerDied","Data":"776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23"} Mar 19 19:41:12 crc kubenswrapper[5033]: I0319 19:41:12.369887 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:41:13 crc kubenswrapper[5033]: I0319 19:41:13.376432 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerStarted","Data":"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517"} Mar 19 19:41:18 crc kubenswrapper[5033]: I0319 19:41:18.438893 5033 generic.go:334] "Generic (PLEG): container finished" podID="d51b7922-7d81-4642-984c-149c6aad8878" containerID="27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517" exitCode=0 Mar 19 19:41:18 crc kubenswrapper[5033]: I0319 19:41:18.439011 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerDied","Data":"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517"} Mar 19 19:41:18 crc kubenswrapper[5033]: I0319 19:41:18.622558 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:41:18 crc kubenswrapper[5033]: E0319 19:41:18.623353 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:41:19 crc kubenswrapper[5033]: I0319 19:41:19.451965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerStarted","Data":"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0"} Mar 19 19:41:19 crc kubenswrapper[5033]: I0319 19:41:19.471394 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-84vx6" podStartSLOduration=2.905885418 podStartE2EDuration="9.471370965s" podCreationTimestamp="2026-03-19 19:41:10 +0000 UTC" firstStartedPulling="2026-03-19 19:41:12.369690255 +0000 UTC m=+2682.474720104" lastFinishedPulling="2026-03-19 19:41:18.935175792 +0000 UTC m=+2689.040205651" observedRunningTime="2026-03-19 19:41:19.467443804 +0000 UTC m=+2689.572473653" watchObservedRunningTime="2026-03-19 19:41:19.471370965 +0000 UTC m=+2689.576400834" Mar 19 19:41:20 crc kubenswrapper[5033]: E0319 19:41:20.352270 5033 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.9:54760->38.102.83.9:39573: write tcp 38.102.83.9:54760->38.102.83.9:39573: write: connection reset by peer Mar 19 19:41:20 crc kubenswrapper[5033]: I0319 19:41:20.721704 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:20 crc kubenswrapper[5033]: I0319 19:41:20.721754 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:21 crc kubenswrapper[5033]: I0319 19:41:21.772801 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-84vx6" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" probeResult="failure" output=< Mar 19 19:41:21 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:41:21 crc kubenswrapper[5033]: > Mar 19 19:41:29 crc kubenswrapper[5033]: I0319 19:41:29.620543 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:41:29 crc kubenswrapper[5033]: E0319 19:41:29.621271 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:41:31 crc kubenswrapper[5033]: I0319 19:41:31.766767 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-84vx6" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" probeResult="failure" output=< Mar 19 19:41:31 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:41:31 crc kubenswrapper[5033]: > Mar 
19 19:41:40 crc kubenswrapper[5033]: I0319 19:41:40.766165 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:40 crc kubenswrapper[5033]: I0319 19:41:40.823623 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:41 crc kubenswrapper[5033]: I0319 19:41:41.582362 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:42 crc kubenswrapper[5033]: I0319 19:41:42.678089 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-84vx6" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" containerID="cri-o://2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0" gracePeriod=2 Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.263106 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.441797 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities\") pod \"d51b7922-7d81-4642-984c-149c6aad8878\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.441922 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4gxf\" (UniqueName: \"kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf\") pod \"d51b7922-7d81-4642-984c-149c6aad8878\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.442003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content\") pod \"d51b7922-7d81-4642-984c-149c6aad8878\" (UID: \"d51b7922-7d81-4642-984c-149c6aad8878\") " Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.442828 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities" (OuterVolumeSpecName: "utilities") pod "d51b7922-7d81-4642-984c-149c6aad8878" (UID: "d51b7922-7d81-4642-984c-149c6aad8878"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.443141 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.447763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf" (OuterVolumeSpecName: "kube-api-access-t4gxf") pod "d51b7922-7d81-4642-984c-149c6aad8878" (UID: "d51b7922-7d81-4642-984c-149c6aad8878"). InnerVolumeSpecName "kube-api-access-t4gxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.545081 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4gxf\" (UniqueName: \"kubernetes.io/projected/d51b7922-7d81-4642-984c-149c6aad8878-kube-api-access-t4gxf\") on node \"crc\" DevicePath \"\"" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.570095 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d51b7922-7d81-4642-984c-149c6aad8878" (UID: "d51b7922-7d81-4642-984c-149c6aad8878"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.621393 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:41:43 crc kubenswrapper[5033]: E0319 19:41:43.621764 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.647308 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d51b7922-7d81-4642-984c-149c6aad8878-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.692763 5033 generic.go:334] "Generic (PLEG): container finished" podID="d51b7922-7d81-4642-984c-149c6aad8878" containerID="2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0" exitCode=0 Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.692816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerDied","Data":"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0"} Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.692863 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84vx6" event={"ID":"d51b7922-7d81-4642-984c-149c6aad8878","Type":"ContainerDied","Data":"5790214f76bf12cac36413c768e9468003fedc9b02b55febfb417bf64972edff"} Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.692870 5033 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84vx6" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.692883 5033 scope.go:117] "RemoveContainer" containerID="2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.729683 5033 scope.go:117] "RemoveContainer" containerID="27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.759598 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.768844 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-84vx6"] Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.772998 5033 scope.go:117] "RemoveContainer" containerID="776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.826520 5033 scope.go:117] "RemoveContainer" containerID="2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0" Mar 19 19:41:43 crc kubenswrapper[5033]: E0319 19:41:43.827042 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0\": container with ID starting with 2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0 not found: ID does not exist" containerID="2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.827108 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0"} err="failed to get container status \"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0\": rpc error: code = NotFound desc = could not find 
container \"2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0\": container with ID starting with 2b80dab152e8bfeb0c34745f82a13236890338f832b0e065c1019e81bf15a3a0 not found: ID does not exist" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.827147 5033 scope.go:117] "RemoveContainer" containerID="27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517" Mar 19 19:41:43 crc kubenswrapper[5033]: E0319 19:41:43.827687 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517\": container with ID starting with 27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517 not found: ID does not exist" containerID="27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.827762 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517"} err="failed to get container status \"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517\": rpc error: code = NotFound desc = could not find container \"27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517\": container with ID starting with 27daae09423cc07aafc373f7d2cdf75b672f0517a430e34aba5dad2e5aa19517 not found: ID does not exist" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.827792 5033 scope.go:117] "RemoveContainer" containerID="776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23" Mar 19 19:41:43 crc kubenswrapper[5033]: E0319 19:41:43.828126 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23\": container with ID starting with 776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23 not found: ID does 
not exist" containerID="776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23" Mar 19 19:41:43 crc kubenswrapper[5033]: I0319 19:41:43.828165 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23"} err="failed to get container status \"776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23\": rpc error: code = NotFound desc = could not find container \"776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23\": container with ID starting with 776c327826cf5786af78584bb85747042acdc5fbd1cf9f846dfc6b9545047c23 not found: ID does not exist" Mar 19 19:41:44 crc kubenswrapper[5033]: I0319 19:41:44.634737 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51b7922-7d81-4642-984c-149c6aad8878" path="/var/lib/kubelet/pods/d51b7922-7d81-4642-984c-149c6aad8878/volumes" Mar 19 19:41:57 crc kubenswrapper[5033]: I0319 19:41:57.621233 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:41:57 crc kubenswrapper[5033]: E0319 19:41:57.622424 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.150084 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565822-sbkw4"] Mar 19 19:42:00 crc kubenswrapper[5033]: E0319 19:42:00.150970 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="extract-content" Mar 19 19:42:00 
crc kubenswrapper[5033]: I0319 19:42:00.150986 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="extract-content" Mar 19 19:42:00 crc kubenswrapper[5033]: E0319 19:42:00.151008 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.151016 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" Mar 19 19:42:00 crc kubenswrapper[5033]: E0319 19:42:00.151039 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="extract-utilities" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.151046 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="extract-utilities" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.151276 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51b7922-7d81-4642-984c-149c6aad8878" containerName="registry-server" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.152378 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.155051 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.155402 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.157091 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.166730 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-sbkw4"] Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.257698 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98brg\" (UniqueName: \"kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg\") pod \"auto-csr-approver-29565822-sbkw4\" (UID: \"233ac508-979f-4f6c-abde-e09932f355ab\") " pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.360278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98brg\" (UniqueName: \"kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg\") pod \"auto-csr-approver-29565822-sbkw4\" (UID: \"233ac508-979f-4f6c-abde-e09932f355ab\") " pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.384427 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98brg\" (UniqueName: \"kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg\") pod \"auto-csr-approver-29565822-sbkw4\" (UID: \"233ac508-979f-4f6c-abde-e09932f355ab\") " 
pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:00 crc kubenswrapper[5033]: I0319 19:42:00.484292 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:01 crc kubenswrapper[5033]: I0319 19:42:01.007587 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-sbkw4"] Mar 19 19:42:01 crc kubenswrapper[5033]: W0319 19:42:01.014489 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod233ac508_979f_4f6c_abde_e09932f355ab.slice/crio-67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608 WatchSource:0}: Error finding container 67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608: Status 404 returned error can't find the container with id 67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608 Mar 19 19:42:01 crc kubenswrapper[5033]: I0319 19:42:01.881623 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" event={"ID":"233ac508-979f-4f6c-abde-e09932f355ab","Type":"ContainerStarted","Data":"67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608"} Mar 19 19:42:02 crc kubenswrapper[5033]: I0319 19:42:02.895907 5033 generic.go:334] "Generic (PLEG): container finished" podID="233ac508-979f-4f6c-abde-e09932f355ab" containerID="986551d48629c59cb4394cb024cd56735ca0d71552f497aa836e706232f45c64" exitCode=0 Mar 19 19:42:02 crc kubenswrapper[5033]: I0319 19:42:02.896018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" event={"ID":"233ac508-979f-4f6c-abde-e09932f355ab","Type":"ContainerDied","Data":"986551d48629c59cb4394cb024cd56735ca0d71552f497aa836e706232f45c64"} Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.347539 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.472624 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98brg\" (UniqueName: \"kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg\") pod \"233ac508-979f-4f6c-abde-e09932f355ab\" (UID: \"233ac508-979f-4f6c-abde-e09932f355ab\") " Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.477825 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg" (OuterVolumeSpecName: "kube-api-access-98brg") pod "233ac508-979f-4f6c-abde-e09932f355ab" (UID: "233ac508-979f-4f6c-abde-e09932f355ab"). InnerVolumeSpecName "kube-api-access-98brg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.575072 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98brg\" (UniqueName: \"kubernetes.io/projected/233ac508-979f-4f6c-abde-e09932f355ab-kube-api-access-98brg\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.914918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" event={"ID":"233ac508-979f-4f6c-abde-e09932f355ab","Type":"ContainerDied","Data":"67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608"} Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.914966 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-sbkw4" Mar 19 19:42:04 crc kubenswrapper[5033]: I0319 19:42:04.914967 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ec526ed7eaae044eb1a0ec7aa25210bd4b1d55719269832f0a3c0966a21608" Mar 19 19:42:05 crc kubenswrapper[5033]: I0319 19:42:05.416714 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-tmd9q"] Mar 19 19:42:05 crc kubenswrapper[5033]: I0319 19:42:05.425243 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-tmd9q"] Mar 19 19:42:06 crc kubenswrapper[5033]: I0319 19:42:06.633567 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="851856fc-535e-439c-a06e-df5962717a15" path="/var/lib/kubelet/pods/851856fc-535e-439c-a06e-df5962717a15/volumes" Mar 19 19:42:10 crc kubenswrapper[5033]: I0319 19:42:10.631971 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:42:10 crc kubenswrapper[5033]: E0319 19:42:10.633687 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:42:16 crc kubenswrapper[5033]: I0319 19:42:16.469288 5033 scope.go:117] "RemoveContainer" containerID="f8deb9d56461cf262278de5ac87943ae059e3b464aaa0127790a9913ef6a1165" Mar 19 19:42:24 crc kubenswrapper[5033]: I0319 19:42:24.620624 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:42:24 crc kubenswrapper[5033]: E0319 19:42:24.621317 5033 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.700991 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:34 crc kubenswrapper[5033]: E0319 19:42:34.702100 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233ac508-979f-4f6c-abde-e09932f355ab" containerName="oc" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.702116 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="233ac508-979f-4f6c-abde-e09932f355ab" containerName="oc" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.702343 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="233ac508-979f-4f6c-abde-e09932f355ab" containerName="oc" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.703831 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.751214 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.872782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.873136 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.873261 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l94kz\" (UniqueName: \"kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.974639 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.974712 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.975161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.975199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:34 crc kubenswrapper[5033]: I0319 19:42:34.975319 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l94kz\" (UniqueName: \"kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:35 crc kubenswrapper[5033]: I0319 19:42:34.998703 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l94kz\" (UniqueName: \"kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz\") pod \"community-operators-kppss\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:35 crc kubenswrapper[5033]: I0319 19:42:35.023928 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:35 crc kubenswrapper[5033]: I0319 19:42:35.639560 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:36 crc kubenswrapper[5033]: I0319 19:42:36.241363 5033 generic.go:334] "Generic (PLEG): container finished" podID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerID="1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6" exitCode=0 Mar 19 19:42:36 crc kubenswrapper[5033]: I0319 19:42:36.241786 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerDied","Data":"1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6"} Mar 19 19:42:36 crc kubenswrapper[5033]: I0319 19:42:36.242718 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerStarted","Data":"a8ebe87c03bd5eff7588099eb37f46ea7aa5a75fa9b7c3cf3d4a381b6951ba35"} Mar 19 19:42:38 crc kubenswrapper[5033]: I0319 19:42:38.263879 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerStarted","Data":"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869"} Mar 19 19:42:38 crc kubenswrapper[5033]: I0319 19:42:38.621477 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:42:38 crc kubenswrapper[5033]: E0319 19:42:38.621762 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:42:39 crc kubenswrapper[5033]: I0319 19:42:39.273932 5033 generic.go:334] "Generic (PLEG): container finished" podID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerID="6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869" exitCode=0 Mar 19 19:42:39 crc kubenswrapper[5033]: I0319 19:42:39.273989 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerDied","Data":"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869"} Mar 19 19:42:44 crc kubenswrapper[5033]: I0319 19:42:44.323958 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerStarted","Data":"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc"} Mar 19 19:42:44 crc kubenswrapper[5033]: I0319 19:42:44.344308 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kppss" podStartSLOduration=2.878072713 podStartE2EDuration="10.344290207s" podCreationTimestamp="2026-03-19 19:42:34 +0000 UTC" firstStartedPulling="2026-03-19 19:42:36.245668895 +0000 UTC m=+2766.350698754" lastFinishedPulling="2026-03-19 19:42:43.711886399 +0000 UTC m=+2773.816916248" observedRunningTime="2026-03-19 19:42:44.339310526 +0000 UTC m=+2774.444340375" watchObservedRunningTime="2026-03-19 19:42:44.344290207 +0000 UTC m=+2774.449320056" Mar 19 19:42:45 crc kubenswrapper[5033]: I0319 19:42:45.044521 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:45 crc kubenswrapper[5033]: I0319 
19:42:45.044583 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:46 crc kubenswrapper[5033]: I0319 19:42:46.109069 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-kppss" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="registry-server" probeResult="failure" output=< Mar 19 19:42:46 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 19:42:46 crc kubenswrapper[5033]: > Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.256705 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.259649 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.266131 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.419852 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.419988 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bnq4\" (UniqueName: \"kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.420105 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.521526 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.521673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bnq4\" (UniqueName: \"kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.521786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.522141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.522331 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.556161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bnq4\" (UniqueName: \"kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4\") pod \"certified-operators-52ms7\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:50 crc kubenswrapper[5033]: I0319 19:42:50.581138 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:42:51 crc kubenswrapper[5033]: I0319 19:42:51.075538 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:42:51 crc kubenswrapper[5033]: I0319 19:42:51.408343 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerID="3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a" exitCode=0 Mar 19 19:42:51 crc kubenswrapper[5033]: I0319 19:42:51.408538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerDied","Data":"3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a"} Mar 19 19:42:51 crc kubenswrapper[5033]: I0319 19:42:51.408705 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerStarted","Data":"720f003d0ad82a1e02a57c019271afb91b36e9f4c4b2e008cc2aa9f6b9dd4448"} Mar 19 
19:42:51 crc kubenswrapper[5033]: I0319 19:42:51.620294 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:42:52 crc kubenswrapper[5033]: I0319 19:42:52.431067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6"} Mar 19 19:42:52 crc kubenswrapper[5033]: I0319 19:42:52.437268 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerStarted","Data":"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a"} Mar 19 19:42:54 crc kubenswrapper[5033]: I0319 19:42:54.465694 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerID="990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a" exitCode=0 Mar 19 19:42:54 crc kubenswrapper[5033]: I0319 19:42:54.465826 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerDied","Data":"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a"} Mar 19 19:42:55 crc kubenswrapper[5033]: I0319 19:42:55.095040 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:55 crc kubenswrapper[5033]: I0319 19:42:55.153999 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:55 crc kubenswrapper[5033]: I0319 19:42:55.496089 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" 
event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerStarted","Data":"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60"} Mar 19 19:42:55 crc kubenswrapper[5033]: I0319 19:42:55.518807 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52ms7" podStartSLOduration=2.067030963 podStartE2EDuration="5.518781521s" podCreationTimestamp="2026-03-19 19:42:50 +0000 UTC" firstStartedPulling="2026-03-19 19:42:51.410632893 +0000 UTC m=+2781.515662742" lastFinishedPulling="2026-03-19 19:42:54.862383451 +0000 UTC m=+2784.967413300" observedRunningTime="2026-03-19 19:42:55.514355085 +0000 UTC m=+2785.619384944" watchObservedRunningTime="2026-03-19 19:42:55.518781521 +0000 UTC m=+2785.623811370" Mar 19 19:42:57 crc kubenswrapper[5033]: I0319 19:42:57.437840 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:57 crc kubenswrapper[5033]: I0319 19:42:57.438722 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kppss" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="registry-server" containerID="cri-o://5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc" gracePeriod=2 Mar 19 19:42:57 crc kubenswrapper[5033]: I0319 19:42:57.935225 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.087860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l94kz\" (UniqueName: \"kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz\") pod \"591e5b90-1142-4339-a4ec-ca840f4e75dc\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.088097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content\") pod \"591e5b90-1142-4339-a4ec-ca840f4e75dc\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.088156 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities\") pod \"591e5b90-1142-4339-a4ec-ca840f4e75dc\" (UID: \"591e5b90-1142-4339-a4ec-ca840f4e75dc\") " Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.089012 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities" (OuterVolumeSpecName: "utilities") pod "591e5b90-1142-4339-a4ec-ca840f4e75dc" (UID: "591e5b90-1142-4339-a4ec-ca840f4e75dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.094731 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz" (OuterVolumeSpecName: "kube-api-access-l94kz") pod "591e5b90-1142-4339-a4ec-ca840f4e75dc" (UID: "591e5b90-1142-4339-a4ec-ca840f4e75dc"). InnerVolumeSpecName "kube-api-access-l94kz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.138432 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "591e5b90-1142-4339-a4ec-ca840f4e75dc" (UID: "591e5b90-1142-4339-a4ec-ca840f4e75dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.190041 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.190078 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591e5b90-1142-4339-a4ec-ca840f4e75dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.190089 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l94kz\" (UniqueName: \"kubernetes.io/projected/591e5b90-1142-4339-a4ec-ca840f4e75dc-kube-api-access-l94kz\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.528271 5033 generic.go:334] "Generic (PLEG): container finished" podID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerID="5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc" exitCode=0 Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.528365 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerDied","Data":"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc"} Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.529115 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-kppss" event={"ID":"591e5b90-1142-4339-a4ec-ca840f4e75dc","Type":"ContainerDied","Data":"a8ebe87c03bd5eff7588099eb37f46ea7aa5a75fa9b7c3cf3d4a381b6951ba35"} Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.528404 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kppss" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.529161 5033 scope.go:117] "RemoveContainer" containerID="5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.547327 5033 scope.go:117] "RemoveContainer" containerID="6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.565327 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.576331 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kppss"] Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.582051 5033 scope.go:117] "RemoveContainer" containerID="1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.629558 5033 scope.go:117] "RemoveContainer" containerID="5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc" Mar 19 19:42:58 crc kubenswrapper[5033]: E0319 19:42:58.629965 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc\": container with ID starting with 5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc not found: ID does not exist" containerID="5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 
19:42:58.630024 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc"} err="failed to get container status \"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc\": rpc error: code = NotFound desc = could not find container \"5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc\": container with ID starting with 5091dcfd7574025ef2b36d2c6a5ed410fd6b5dadf7658f828c4bda289f9b0ecc not found: ID does not exist" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.630051 5033 scope.go:117] "RemoveContainer" containerID="6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869" Mar 19 19:42:58 crc kubenswrapper[5033]: E0319 19:42:58.630361 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869\": container with ID starting with 6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869 not found: ID does not exist" containerID="6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.630392 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869"} err="failed to get container status \"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869\": rpc error: code = NotFound desc = could not find container \"6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869\": container with ID starting with 6ee071d387b507e01b9c5215bd9cfdcd78bf8c583de42a899a38df82e851a869 not found: ID does not exist" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.630414 5033 scope.go:117] "RemoveContainer" containerID="1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6" Mar 19 19:42:58 crc 
kubenswrapper[5033]: E0319 19:42:58.630735 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6\": container with ID starting with 1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6 not found: ID does not exist" containerID="1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.630753 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6"} err="failed to get container status \"1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6\": rpc error: code = NotFound desc = could not find container \"1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6\": container with ID starting with 1d86929ff27ccd791a5a2eba4d4d92a91ca1e6b67f1056bd2ac82a703ad5d0e6 not found: ID does not exist" Mar 19 19:42:58 crc kubenswrapper[5033]: I0319 19:42:58.633618 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" path="/var/lib/kubelet/pods/591e5b90-1142-4339-a4ec-ca840f4e75dc/volumes" Mar 19 19:43:00 crc kubenswrapper[5033]: I0319 19:43:00.582282 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:00 crc kubenswrapper[5033]: I0319 19:43:00.582769 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:00 crc kubenswrapper[5033]: I0319 19:43:00.634307 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:01 crc kubenswrapper[5033]: I0319 19:43:01.609122 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:01 crc kubenswrapper[5033]: I0319 19:43:01.844811 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:43:03 crc kubenswrapper[5033]: I0319 19:43:03.581125 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52ms7" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="registry-server" containerID="cri-o://dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60" gracePeriod=2 Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.128482 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.220209 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content\") pod \"f1e97b32-58ce-429a-bc37-5f7c41008e78\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.220359 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bnq4\" (UniqueName: \"kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4\") pod \"f1e97b32-58ce-429a-bc37-5f7c41008e78\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.220585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities\") pod \"f1e97b32-58ce-429a-bc37-5f7c41008e78\" (UID: \"f1e97b32-58ce-429a-bc37-5f7c41008e78\") " Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.221427 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities" (OuterVolumeSpecName: "utilities") pod "f1e97b32-58ce-429a-bc37-5f7c41008e78" (UID: "f1e97b32-58ce-429a-bc37-5f7c41008e78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.226196 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4" (OuterVolumeSpecName: "kube-api-access-5bnq4") pod "f1e97b32-58ce-429a-bc37-5f7c41008e78" (UID: "f1e97b32-58ce-429a-bc37-5f7c41008e78"). InnerVolumeSpecName "kube-api-access-5bnq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.277982 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1e97b32-58ce-429a-bc37-5f7c41008e78" (UID: "f1e97b32-58ce-429a-bc37-5f7c41008e78"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.322878 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bnq4\" (UniqueName: \"kubernetes.io/projected/f1e97b32-58ce-429a-bc37-5f7c41008e78-kube-api-access-5bnq4\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.322925 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.322936 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e97b32-58ce-429a-bc37-5f7c41008e78-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.597272 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerID="dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60" exitCode=0 Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.597344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerDied","Data":"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60"} Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.597485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52ms7" event={"ID":"f1e97b32-58ce-429a-bc37-5f7c41008e78","Type":"ContainerDied","Data":"720f003d0ad82a1e02a57c019271afb91b36e9f4c4b2e008cc2aa9f6b9dd4448"} Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.597535 5033 scope.go:117] "RemoveContainer" containerID="dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 
19:43:04.597621 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52ms7" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.616961 5033 scope.go:117] "RemoveContainer" containerID="990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.646062 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.655087 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-52ms7"] Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.660788 5033 scope.go:117] "RemoveContainer" containerID="3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.697671 5033 scope.go:117] "RemoveContainer" containerID="dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60" Mar 19 19:43:04 crc kubenswrapper[5033]: E0319 19:43:04.698297 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60\": container with ID starting with dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60 not found: ID does not exist" containerID="dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.698335 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60"} err="failed to get container status \"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60\": rpc error: code = NotFound desc = could not find container \"dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60\": container with ID starting with 
dd54c83c3a7a0f7b625e170c496f83b2a7b103a2aa005797df77cc00a57cba60 not found: ID does not exist" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.698360 5033 scope.go:117] "RemoveContainer" containerID="990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a" Mar 19 19:43:04 crc kubenswrapper[5033]: E0319 19:43:04.698696 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a\": container with ID starting with 990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a not found: ID does not exist" containerID="990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.698728 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a"} err="failed to get container status \"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a\": rpc error: code = NotFound desc = could not find container \"990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a\": container with ID starting with 990de7a1d1c1989cdde6c7784dbb507fb72ba652e01aa2b7d884b9ed3bc1815a not found: ID does not exist" Mar 19 19:43:04 crc kubenswrapper[5033]: I0319 19:43:04.698747 5033 scope.go:117] "RemoveContainer" containerID="3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a" Mar 19 19:43:04 crc kubenswrapper[5033]: E0319 19:43:04.699621 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a\": container with ID starting with 3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a not found: ID does not exist" containerID="3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a" Mar 19 19:43:04 crc 
kubenswrapper[5033]: I0319 19:43:04.699658 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a"} err="failed to get container status \"3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a\": rpc error: code = NotFound desc = could not find container \"3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a\": container with ID starting with 3e39bc4e2dee097a23770c31a0cc9ed0a6443caf70183e014526f08eb63c892a not found: ID does not exist" Mar 19 19:43:06 crc kubenswrapper[5033]: I0319 19:43:06.635617 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" path="/var/lib/kubelet/pods/f1e97b32-58ce-429a-bc37-5f7c41008e78/volumes" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.171336 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565824-l92rr"] Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172383 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172396 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172413 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172418 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172442 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="extract-utilities" Mar 
19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172464 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="extract-utilities" Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172483 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172488 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172506 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="extract-utilities" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172513 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="extract-utilities" Mar 19 19:44:00 crc kubenswrapper[5033]: E0319 19:44:00.172520 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172526 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172699 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e97b32-58ce-429a-bc37-5f7c41008e78" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.172713 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="591e5b90-1142-4339-a4ec-ca840f4e75dc" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.173401 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.176834 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.176955 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.177164 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.192174 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-l92rr"] Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.326657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2s2d\" (UniqueName: \"kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d\") pod \"auto-csr-approver-29565824-l92rr\" (UID: \"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b\") " pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.428469 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2s2d\" (UniqueName: \"kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d\") pod \"auto-csr-approver-29565824-l92rr\" (UID: \"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b\") " pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.448581 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2s2d\" (UniqueName: \"kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d\") pod \"auto-csr-approver-29565824-l92rr\" (UID: \"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b\") " 
pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.511371 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:00 crc kubenswrapper[5033]: W0319 19:44:00.977906 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf2dd1b_44e9_442a_8f70_9ef6ff99230b.slice/crio-62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07 WatchSource:0}: Error finding container 62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07: Status 404 returned error can't find the container with id 62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07 Mar 19 19:44:00 crc kubenswrapper[5033]: I0319 19:44:00.981043 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-l92rr"] Mar 19 19:44:01 crc kubenswrapper[5033]: I0319 19:44:01.143745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-l92rr" event={"ID":"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b","Type":"ContainerStarted","Data":"62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07"} Mar 19 19:44:03 crc kubenswrapper[5033]: I0319 19:44:03.165441 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" containerID="80f9e2108121b537c5088939aaea09b200275ca4214cfeec23f2af76e1fbeea6" exitCode=0 Mar 19 19:44:03 crc kubenswrapper[5033]: I0319 19:44:03.165585 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-l92rr" event={"ID":"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b","Type":"ContainerDied","Data":"80f9e2108121b537c5088939aaea09b200275ca4214cfeec23f2af76e1fbeea6"} Mar 19 19:44:04 crc kubenswrapper[5033]: I0319 19:44:04.592891 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:04 crc kubenswrapper[5033]: I0319 19:44:04.731546 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2s2d\" (UniqueName: \"kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d\") pod \"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b\" (UID: \"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b\") " Mar 19 19:44:04 crc kubenswrapper[5033]: I0319 19:44:04.737852 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d" (OuterVolumeSpecName: "kube-api-access-g2s2d") pod "dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" (UID: "dbf2dd1b-44e9-442a-8f70-9ef6ff99230b"). InnerVolumeSpecName "kube-api-access-g2s2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:44:04 crc kubenswrapper[5033]: I0319 19:44:04.835212 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2s2d\" (UniqueName: \"kubernetes.io/projected/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b-kube-api-access-g2s2d\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:05 crc kubenswrapper[5033]: I0319 19:44:05.190333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-l92rr" event={"ID":"dbf2dd1b-44e9-442a-8f70-9ef6ff99230b","Type":"ContainerDied","Data":"62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07"} Mar 19 19:44:05 crc kubenswrapper[5033]: I0319 19:44:05.190382 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62d269d2b9d475a22e2c5bcce9fe1ed1ffcda7a2d6fda94a9020a3d2e8953e07" Mar 19 19:44:05 crc kubenswrapper[5033]: I0319 19:44:05.190489 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-l92rr" Mar 19 19:44:05 crc kubenswrapper[5033]: I0319 19:44:05.665476 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-vjswh"] Mar 19 19:44:05 crc kubenswrapper[5033]: I0319 19:44:05.677979 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-vjswh"] Mar 19 19:44:06 crc kubenswrapper[5033]: I0319 19:44:06.630752 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb883c9-bc03-46e8-9578-6da2f13278ab" path="/var/lib/kubelet/pods/8bb883c9-bc03-46e8-9578-6da2f13278ab/volumes" Mar 19 19:44:16 crc kubenswrapper[5033]: I0319 19:44:16.623698 5033 scope.go:117] "RemoveContainer" containerID="f0db6a69b63b8e7aa881c570f9021f09d11294c8880fd4fae08517b222fb12ca" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.020358 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:26 crc kubenswrapper[5033]: E0319 19:44:26.021545 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" containerName="oc" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.021564 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" containerName="oc" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.021809 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" containerName="oc" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.023769 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.034639 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.131158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.131561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f57b\" (UniqueName: \"kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.131680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.233090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.233203 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8f57b\" (UniqueName: \"kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.233266 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.233689 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.233705 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.257503 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f57b\" (UniqueName: \"kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b\") pod \"redhat-marketplace-gdg7z\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.350792 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:26 crc kubenswrapper[5033]: I0319 19:44:26.862365 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:27 crc kubenswrapper[5033]: I0319 19:44:27.404386 5033 generic.go:334] "Generic (PLEG): container finished" podID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerID="1f899451c235f50517a105c9973d314319ce20edeb152ce40e40c11ac18457c8" exitCode=0 Mar 19 19:44:27 crc kubenswrapper[5033]: I0319 19:44:27.404481 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerDied","Data":"1f899451c235f50517a105c9973d314319ce20edeb152ce40e40c11ac18457c8"} Mar 19 19:44:27 crc kubenswrapper[5033]: I0319 19:44:27.404675 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerStarted","Data":"fdd7214f931f9a27e6330793a45d206ad5b2fa12ccb0e7fcf1fe6b6d2d896d61"} Mar 19 19:44:29 crc kubenswrapper[5033]: I0319 19:44:29.430281 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerStarted","Data":"1efda9a0dc34e3df8e9778a4263b83421e39e30a85db79899c50b5298a53de8b"} Mar 19 19:44:30 crc kubenswrapper[5033]: I0319 19:44:30.443436 5033 generic.go:334] "Generic (PLEG): container finished" podID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerID="1efda9a0dc34e3df8e9778a4263b83421e39e30a85db79899c50b5298a53de8b" exitCode=0 Mar 19 19:44:30 crc kubenswrapper[5033]: I0319 19:44:30.443485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" 
event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerDied","Data":"1efda9a0dc34e3df8e9778a4263b83421e39e30a85db79899c50b5298a53de8b"} Mar 19 19:44:31 crc kubenswrapper[5033]: I0319 19:44:31.457982 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerStarted","Data":"b4fe435a7fd132fe1d089f14dab7ec9e9e24b5cd78f60a54f7e099d543aedcff"} Mar 19 19:44:31 crc kubenswrapper[5033]: I0319 19:44:31.479916 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gdg7z" podStartSLOduration=3.019260853 podStartE2EDuration="6.479896739s" podCreationTimestamp="2026-03-19 19:44:25 +0000 UTC" firstStartedPulling="2026-03-19 19:44:27.405883875 +0000 UTC m=+2877.510913724" lastFinishedPulling="2026-03-19 19:44:30.866519751 +0000 UTC m=+2880.971549610" observedRunningTime="2026-03-19 19:44:31.473275972 +0000 UTC m=+2881.578305821" watchObservedRunningTime="2026-03-19 19:44:31.479896739 +0000 UTC m=+2881.584926578" Mar 19 19:44:36 crc kubenswrapper[5033]: I0319 19:44:36.351033 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:36 crc kubenswrapper[5033]: I0319 19:44:36.351590 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:36 crc kubenswrapper[5033]: I0319 19:44:36.397759 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:36 crc kubenswrapper[5033]: I0319 19:44:36.556754 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:36 crc kubenswrapper[5033]: I0319 19:44:36.636590 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:38 crc kubenswrapper[5033]: I0319 19:44:38.536571 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gdg7z" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="registry-server" containerID="cri-o://b4fe435a7fd132fe1d089f14dab7ec9e9e24b5cd78f60a54f7e099d543aedcff" gracePeriod=2 Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.550460 5033 generic.go:334] "Generic (PLEG): container finished" podID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerID="b4fe435a7fd132fe1d089f14dab7ec9e9e24b5cd78f60a54f7e099d543aedcff" exitCode=0 Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.550524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerDied","Data":"b4fe435a7fd132fe1d089f14dab7ec9e9e24b5cd78f60a54f7e099d543aedcff"} Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.550827 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gdg7z" event={"ID":"7ac57b83-639b-4ebc-b226-9ffb56610446","Type":"ContainerDied","Data":"fdd7214f931f9a27e6330793a45d206ad5b2fa12ccb0e7fcf1fe6b6d2d896d61"} Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.550842 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd7214f931f9a27e6330793a45d206ad5b2fa12ccb0e7fcf1fe6b6d2d896d61" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.563923 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.742727 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content\") pod \"7ac57b83-639b-4ebc-b226-9ffb56610446\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.743048 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities\") pod \"7ac57b83-639b-4ebc-b226-9ffb56610446\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.743539 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f57b\" (UniqueName: \"kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b\") pod \"7ac57b83-639b-4ebc-b226-9ffb56610446\" (UID: \"7ac57b83-639b-4ebc-b226-9ffb56610446\") " Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.744504 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities" (OuterVolumeSpecName: "utilities") pod "7ac57b83-639b-4ebc-b226-9ffb56610446" (UID: "7ac57b83-639b-4ebc-b226-9ffb56610446"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.746823 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.750924 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b" (OuterVolumeSpecName: "kube-api-access-8f57b") pod "7ac57b83-639b-4ebc-b226-9ffb56610446" (UID: "7ac57b83-639b-4ebc-b226-9ffb56610446"). InnerVolumeSpecName "kube-api-access-8f57b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.768713 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ac57b83-639b-4ebc-b226-9ffb56610446" (UID: "7ac57b83-639b-4ebc-b226-9ffb56610446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.848937 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac57b83-639b-4ebc-b226-9ffb56610446-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:39 crc kubenswrapper[5033]: I0319 19:44:39.848967 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f57b\" (UniqueName: \"kubernetes.io/projected/7ac57b83-639b-4ebc-b226-9ffb56610446-kube-api-access-8f57b\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:40 crc kubenswrapper[5033]: I0319 19:44:40.561317 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gdg7z" Mar 19 19:44:40 crc kubenswrapper[5033]: I0319 19:44:40.601347 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:40 crc kubenswrapper[5033]: I0319 19:44:40.610199 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gdg7z"] Mar 19 19:44:40 crc kubenswrapper[5033]: I0319 19:44:40.631199 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" path="/var/lib/kubelet/pods/7ac57b83-639b-4ebc-b226-9ffb56610446/volumes" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.148289 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc"] Mar 19 19:45:00 crc kubenswrapper[5033]: E0319 19:45:00.149238 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="registry-server" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.149252 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="registry-server" Mar 19 19:45:00 crc kubenswrapper[5033]: E0319 19:45:00.149265 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="extract-content" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.149270 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="extract-content" Mar 19 19:45:00 crc kubenswrapper[5033]: E0319 19:45:00.149293 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="extract-utilities" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.149299 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="extract-utilities" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.149545 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac57b83-639b-4ebc-b226-9ffb56610446" containerName="registry-server" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.150320 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.152230 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.152385 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.157164 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc"] Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.230805 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.230964 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc 
kubenswrapper[5033]: I0319 19:45:00.231009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkrc\" (UniqueName: \"kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.333461 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.333584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.333634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkrc\" (UniqueName: \"kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.334524 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume\") pod \"collect-profiles-29565825-2p6gc\" 
(UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.339415 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.359355 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkrc\" (UniqueName: \"kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc\") pod \"collect-profiles-29565825-2p6gc\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:00 crc kubenswrapper[5033]: I0319 19:45:00.485743 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:01 crc kubenswrapper[5033]: I0319 19:45:01.005688 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc"] Mar 19 19:45:01 crc kubenswrapper[5033]: I0319 19:45:01.781532 5033 generic.go:334] "Generic (PLEG): container finished" podID="756ad68c-ccdb-41db-80bc-923f130791d6" containerID="59f8423300628e3af197a9bcb4e7b30069e04ca5b7b330029b455075319acbcc" exitCode=0 Mar 19 19:45:01 crc kubenswrapper[5033]: I0319 19:45:01.781716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" event={"ID":"756ad68c-ccdb-41db-80bc-923f130791d6","Type":"ContainerDied","Data":"59f8423300628e3af197a9bcb4e7b30069e04ca5b7b330029b455075319acbcc"} Mar 19 19:45:01 crc kubenswrapper[5033]: I0319 19:45:01.781864 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" event={"ID":"756ad68c-ccdb-41db-80bc-923f130791d6","Type":"ContainerStarted","Data":"d4b256faa15661751d17b346698174b88072d5d8115b99a2b44bc57789a44383"} Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.214650 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.401076 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume\") pod \"756ad68c-ccdb-41db-80bc-923f130791d6\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.401495 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume\") pod \"756ad68c-ccdb-41db-80bc-923f130791d6\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.401700 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkrc\" (UniqueName: \"kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc\") pod \"756ad68c-ccdb-41db-80bc-923f130791d6\" (UID: \"756ad68c-ccdb-41db-80bc-923f130791d6\") " Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.402210 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "756ad68c-ccdb-41db-80bc-923f130791d6" (UID: "756ad68c-ccdb-41db-80bc-923f130791d6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.402564 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/756ad68c-ccdb-41db-80bc-923f130791d6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.407375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc" (OuterVolumeSpecName: "kube-api-access-9nkrc") pod "756ad68c-ccdb-41db-80bc-923f130791d6" (UID: "756ad68c-ccdb-41db-80bc-923f130791d6"). InnerVolumeSpecName "kube-api-access-9nkrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.408072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "756ad68c-ccdb-41db-80bc-923f130791d6" (UID: "756ad68c-ccdb-41db-80bc-923f130791d6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.504467 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/756ad68c-ccdb-41db-80bc-923f130791d6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.504767 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkrc\" (UniqueName: \"kubernetes.io/projected/756ad68c-ccdb-41db-80bc-923f130791d6-kube-api-access-9nkrc\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.804478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" event={"ID":"756ad68c-ccdb-41db-80bc-923f130791d6","Type":"ContainerDied","Data":"d4b256faa15661751d17b346698174b88072d5d8115b99a2b44bc57789a44383"} Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.804516 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b256faa15661751d17b346698174b88072d5d8115b99a2b44bc57789a44383" Mar 19 19:45:03 crc kubenswrapper[5033]: I0319 19:45:03.804563 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc" Mar 19 19:45:04 crc kubenswrapper[5033]: I0319 19:45:04.296216 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc"] Mar 19 19:45:04 crc kubenswrapper[5033]: I0319 19:45:04.305431 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-p8glc"] Mar 19 19:45:04 crc kubenswrapper[5033]: I0319 19:45:04.631501 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d8ddb8-2793-4307-9fbc-327e3ee978dd" path="/var/lib/kubelet/pods/63d8ddb8-2793-4307-9fbc-327e3ee978dd/volumes" Mar 19 19:45:10 crc kubenswrapper[5033]: I0319 19:45:10.758772 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:45:10 crc kubenswrapper[5033]: I0319 19:45:10.759269 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:45:16 crc kubenswrapper[5033]: I0319 19:45:16.699618 5033 scope.go:117] "RemoveContainer" containerID="a8f4fc1bf21b28fa40268c9b4514f791e54eb85644577dd8122ba75f8e9ff28e" Mar 19 19:45:40 crc kubenswrapper[5033]: I0319 19:45:40.768358 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 19 19:45:40 crc kubenswrapper[5033]: I0319 19:45:40.769044 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.154065 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565826-fg6wn"] Mar 19 19:46:00 crc kubenswrapper[5033]: E0319 19:46:00.155231 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756ad68c-ccdb-41db-80bc-923f130791d6" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.155249 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="756ad68c-ccdb-41db-80bc-923f130791d6" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.155530 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="756ad68c-ccdb-41db-80bc-923f130791d6" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.156536 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.159585 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.159999 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.160189 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.167059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-fg6wn"] Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.341418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfwp\" (UniqueName: \"kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp\") pod \"auto-csr-approver-29565826-fg6wn\" (UID: \"6bb134de-009b-472e-9c57-b39f7b1fa5ec\") " pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.443961 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbfwp\" (UniqueName: \"kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp\") pod \"auto-csr-approver-29565826-fg6wn\" (UID: \"6bb134de-009b-472e-9c57-b39f7b1fa5ec\") " pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.467337 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbfwp\" (UniqueName: \"kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp\") pod \"auto-csr-approver-29565826-fg6wn\" (UID: \"6bb134de-009b-472e-9c57-b39f7b1fa5ec\") " 
pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.484777 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:00 crc kubenswrapper[5033]: I0319 19:46:00.971303 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-fg6wn"] Mar 19 19:46:01 crc kubenswrapper[5033]: I0319 19:46:01.373635 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" event={"ID":"6bb134de-009b-472e-9c57-b39f7b1fa5ec","Type":"ContainerStarted","Data":"f3bd655e5942a8f57e7dd52b665ae46372ce02d5b20f349a37fffb10c251de66"} Mar 19 19:46:03 crc kubenswrapper[5033]: I0319 19:46:03.404634 5033 generic.go:334] "Generic (PLEG): container finished" podID="6bb134de-009b-472e-9c57-b39f7b1fa5ec" containerID="2f89c2e06e868dbdbc544812487ac4a8aa35ce4986d788123602210b15ae0ba9" exitCode=0 Mar 19 19:46:03 crc kubenswrapper[5033]: I0319 19:46:03.404716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" event={"ID":"6bb134de-009b-472e-9c57-b39f7b1fa5ec","Type":"ContainerDied","Data":"2f89c2e06e868dbdbc544812487ac4a8aa35ce4986d788123602210b15ae0ba9"} Mar 19 19:46:04 crc kubenswrapper[5033]: I0319 19:46:04.814846 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:04 crc kubenswrapper[5033]: I0319 19:46:04.944734 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbfwp\" (UniqueName: \"kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp\") pod \"6bb134de-009b-472e-9c57-b39f7b1fa5ec\" (UID: \"6bb134de-009b-472e-9c57-b39f7b1fa5ec\") " Mar 19 19:46:04 crc kubenswrapper[5033]: I0319 19:46:04.951393 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp" (OuterVolumeSpecName: "kube-api-access-wbfwp") pod "6bb134de-009b-472e-9c57-b39f7b1fa5ec" (UID: "6bb134de-009b-472e-9c57-b39f7b1fa5ec"). InnerVolumeSpecName "kube-api-access-wbfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.046637 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbfwp\" (UniqueName: \"kubernetes.io/projected/6bb134de-009b-472e-9c57-b39f7b1fa5ec-kube-api-access-wbfwp\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.422244 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" event={"ID":"6bb134de-009b-472e-9c57-b39f7b1fa5ec","Type":"ContainerDied","Data":"f3bd655e5942a8f57e7dd52b665ae46372ce02d5b20f349a37fffb10c251de66"} Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.422284 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3bd655e5942a8f57e7dd52b665ae46372ce02d5b20f349a37fffb10c251de66" Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.422313 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-fg6wn" Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.914534 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-jj25g"] Mar 19 19:46:05 crc kubenswrapper[5033]: I0319 19:46:05.932338 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-jj25g"] Mar 19 19:46:06 crc kubenswrapper[5033]: I0319 19:46:06.632866 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901cdd28-c641-475f-a363-fa792b6e0fde" path="/var/lib/kubelet/pods/901cdd28-c641-475f-a363-fa792b6e0fde/volumes" Mar 19 19:46:10 crc kubenswrapper[5033]: I0319 19:46:10.759427 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:46:10 crc kubenswrapper[5033]: I0319 19:46:10.760048 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:46:10 crc kubenswrapper[5033]: I0319 19:46:10.760114 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:46:10 crc kubenswrapper[5033]: I0319 19:46:10.761115 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:46:10 crc kubenswrapper[5033]: I0319 19:46:10.761176 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6" gracePeriod=600 Mar 19 19:46:11 crc kubenswrapper[5033]: I0319 19:46:11.501933 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6" exitCode=0 Mar 19 19:46:11 crc kubenswrapper[5033]: I0319 19:46:11.502002 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6"} Mar 19 19:46:11 crc kubenswrapper[5033]: I0319 19:46:11.502230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"} Mar 19 19:46:11 crc kubenswrapper[5033]: I0319 19:46:11.502257 5033 scope.go:117] "RemoveContainer" containerID="ad04db2db5513ac36478024f140ce3b87132378d06a5edcf63d4f2dcac407ba0" Mar 19 19:46:16 crc kubenswrapper[5033]: I0319 19:46:16.804337 5033 scope.go:117] "RemoveContainer" containerID="936d75a8fafc6cb51bedd99d277eed128fba09664bb7f1531977ba952abb5efa" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.157760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565828-wksld"] Mar 19 19:48:00 crc kubenswrapper[5033]: E0319 
19:48:00.158849 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb134de-009b-472e-9c57-b39f7b1fa5ec" containerName="oc" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.158866 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb134de-009b-472e-9c57-b39f7b1fa5ec" containerName="oc" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.159144 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb134de-009b-472e-9c57-b39f7b1fa5ec" containerName="oc" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.160205 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.162919 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.163527 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.164228 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.174236 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-wksld"] Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.303974 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6zd\" (UniqueName: \"kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd\") pod \"auto-csr-approver-29565828-wksld\" (UID: \"3499873d-d658-4518-a7a0-856060368e78\") " pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.405923 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8s6zd\" (UniqueName: \"kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd\") pod \"auto-csr-approver-29565828-wksld\" (UID: \"3499873d-d658-4518-a7a0-856060368e78\") " pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.428833 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6zd\" (UniqueName: \"kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd\") pod \"auto-csr-approver-29565828-wksld\" (UID: \"3499873d-d658-4518-a7a0-856060368e78\") " pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.479200 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.931316 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-wksld"] Mar 19 19:48:00 crc kubenswrapper[5033]: I0319 19:48:00.935973 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:48:01 crc kubenswrapper[5033]: I0319 19:48:01.600321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-wksld" event={"ID":"3499873d-d658-4518-a7a0-856060368e78","Type":"ContainerStarted","Data":"9989dbb4444d59e5ebba45c0f512c327e16aabef07cea1266772eba32564818d"} Mar 19 19:48:02 crc kubenswrapper[5033]: I0319 19:48:02.612296 5033 generic.go:334] "Generic (PLEG): container finished" podID="3499873d-d658-4518-a7a0-856060368e78" containerID="76af51119966991733dc50fb61c7517eda67919960948e30f77c3e2ea2f7e912" exitCode=0 Mar 19 19:48:02 crc kubenswrapper[5033]: I0319 19:48:02.612332 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-wksld" 
event={"ID":"3499873d-d658-4518-a7a0-856060368e78","Type":"ContainerDied","Data":"76af51119966991733dc50fb61c7517eda67919960948e30f77c3e2ea2f7e912"} Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.047383 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.202802 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s6zd\" (UniqueName: \"kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd\") pod \"3499873d-d658-4518-a7a0-856060368e78\" (UID: \"3499873d-d658-4518-a7a0-856060368e78\") " Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.219750 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd" (OuterVolumeSpecName: "kube-api-access-8s6zd") pod "3499873d-d658-4518-a7a0-856060368e78" (UID: "3499873d-d658-4518-a7a0-856060368e78"). InnerVolumeSpecName "kube-api-access-8s6zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.305803 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s6zd\" (UniqueName: \"kubernetes.io/projected/3499873d-d658-4518-a7a0-856060368e78-kube-api-access-8s6zd\") on node \"crc\" DevicePath \"\"" Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.634764 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-wksld" Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.639879 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-wksld" event={"ID":"3499873d-d658-4518-a7a0-856060368e78","Type":"ContainerDied","Data":"9989dbb4444d59e5ebba45c0f512c327e16aabef07cea1266772eba32564818d"} Mar 19 19:48:04 crc kubenswrapper[5033]: I0319 19:48:04.639961 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9989dbb4444d59e5ebba45c0f512c327e16aabef07cea1266772eba32564818d" Mar 19 19:48:05 crc kubenswrapper[5033]: I0319 19:48:05.136860 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-sbkw4"] Mar 19 19:48:05 crc kubenswrapper[5033]: I0319 19:48:05.145389 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-sbkw4"] Mar 19 19:48:06 crc kubenswrapper[5033]: I0319 19:48:06.634251 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233ac508-979f-4f6c-abde-e09932f355ab" path="/var/lib/kubelet/pods/233ac508-979f-4f6c-abde-e09932f355ab/volumes" Mar 19 19:48:16 crc kubenswrapper[5033]: I0319 19:48:16.897621 5033 scope.go:117] "RemoveContainer" containerID="986551d48629c59cb4394cb024cd56735ca0d71552f497aa836e706232f45c64" Mar 19 19:48:40 crc kubenswrapper[5033]: I0319 19:48:40.758832 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:48:40 crc kubenswrapper[5033]: I0319 19:48:40.759367 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:49:10 crc kubenswrapper[5033]: I0319 19:49:10.759377 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:49:10 crc kubenswrapper[5033]: I0319 19:49:10.760026 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:49:40 crc kubenswrapper[5033]: I0319 19:49:40.758817 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:49:40 crc kubenswrapper[5033]: I0319 19:49:40.759371 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:49:40 crc kubenswrapper[5033]: I0319 19:49:40.759420 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:49:40 crc kubenswrapper[5033]: I0319 19:49:40.760187 5033 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:49:40 crc kubenswrapper[5033]: I0319 19:49:40.760241 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" gracePeriod=600 Mar 19 19:49:40 crc kubenswrapper[5033]: E0319 19:49:40.892889 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:49:41 crc kubenswrapper[5033]: I0319 19:49:41.703332 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" exitCode=0 Mar 19 19:49:41 crc kubenswrapper[5033]: I0319 19:49:41.703378 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"} Mar 19 19:49:41 crc kubenswrapper[5033]: I0319 19:49:41.703681 5033 scope.go:117] "RemoveContainer" containerID="de96e93883a12cf4fc118f147037a03707e50c08e513bf3c07751594d90580c6" Mar 19 19:49:41 crc 
kubenswrapper[5033]: I0319 19:49:41.704616 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:49:41 crc kubenswrapper[5033]: E0319 19:49:41.705208 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:49:52 crc kubenswrapper[5033]: I0319 19:49:52.620326 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:49:52 crc kubenswrapper[5033]: E0319 19:49:52.621003 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.181909 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565830-tks4t"] Mar 19 19:50:00 crc kubenswrapper[5033]: E0319 19:50:00.182921 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3499873d-d658-4518-a7a0-856060368e78" containerName="oc" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.182936 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3499873d-d658-4518-a7a0-856060368e78" containerName="oc" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.183166 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3499873d-d658-4518-a7a0-856060368e78" containerName="oc" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.184037 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.188366 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.188400 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.191140 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-tks4t"] Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.194322 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.349596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rxhg\" (UniqueName: \"kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg\") pod \"auto-csr-approver-29565830-tks4t\" (UID: \"eab0825a-c5eb-42df-9dcc-15e4f6686e42\") " pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.452010 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rxhg\" (UniqueName: \"kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg\") pod \"auto-csr-approver-29565830-tks4t\" (UID: \"eab0825a-c5eb-42df-9dcc-15e4f6686e42\") " pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.476709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rxhg\" (UniqueName: 
\"kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg\") pod \"auto-csr-approver-29565830-tks4t\" (UID: \"eab0825a-c5eb-42df-9dcc-15e4f6686e42\") " pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.503684 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:00 crc kubenswrapper[5033]: I0319 19:50:00.975416 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-tks4t"] Mar 19 19:50:01 crc kubenswrapper[5033]: I0319 19:50:01.921558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-tks4t" event={"ID":"eab0825a-c5eb-42df-9dcc-15e4f6686e42","Type":"ContainerStarted","Data":"f13778932c9363765f2f46f86a85860832e3f9ef86d11b90eb42e141035f5d83"} Mar 19 19:50:02 crc kubenswrapper[5033]: I0319 19:50:02.932672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-tks4t" event={"ID":"eab0825a-c5eb-42df-9dcc-15e4f6686e42","Type":"ContainerStarted","Data":"1ae27b480fe2169bd8c8bd6c8a867040c697e4849027866b73ee2c2c4dcf5d4e"} Mar 19 19:50:02 crc kubenswrapper[5033]: I0319 19:50:02.953969 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565830-tks4t" podStartSLOduration=1.351088692 podStartE2EDuration="2.953953132s" podCreationTimestamp="2026-03-19 19:50:00 +0000 UTC" firstStartedPulling="2026-03-19 19:50:00.981214026 +0000 UTC m=+3211.086243875" lastFinishedPulling="2026-03-19 19:50:02.584078466 +0000 UTC m=+3212.689108315" observedRunningTime="2026-03-19 19:50:02.947663455 +0000 UTC m=+3213.052693304" watchObservedRunningTime="2026-03-19 19:50:02.953953132 +0000 UTC m=+3213.058982981" Mar 19 19:50:03 crc kubenswrapper[5033]: I0319 19:50:03.947025 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="eab0825a-c5eb-42df-9dcc-15e4f6686e42" containerID="1ae27b480fe2169bd8c8bd6c8a867040c697e4849027866b73ee2c2c4dcf5d4e" exitCode=0 Mar 19 19:50:03 crc kubenswrapper[5033]: I0319 19:50:03.947106 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-tks4t" event={"ID":"eab0825a-c5eb-42df-9dcc-15e4f6686e42","Type":"ContainerDied","Data":"1ae27b480fe2169bd8c8bd6c8a867040c697e4849027866b73ee2c2c4dcf5d4e"} Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.290104 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.453986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rxhg\" (UniqueName: \"kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg\") pod \"eab0825a-c5eb-42df-9dcc-15e4f6686e42\" (UID: \"eab0825a-c5eb-42df-9dcc-15e4f6686e42\") " Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.460209 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg" (OuterVolumeSpecName: "kube-api-access-2rxhg") pod "eab0825a-c5eb-42df-9dcc-15e4f6686e42" (UID: "eab0825a-c5eb-42df-9dcc-15e4f6686e42"). InnerVolumeSpecName "kube-api-access-2rxhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.556858 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rxhg\" (UniqueName: \"kubernetes.io/projected/eab0825a-c5eb-42df-9dcc-15e4f6686e42-kube-api-access-2rxhg\") on node \"crc\" DevicePath \"\"" Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.966295 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-tks4t" event={"ID":"eab0825a-c5eb-42df-9dcc-15e4f6686e42","Type":"ContainerDied","Data":"f13778932c9363765f2f46f86a85860832e3f9ef86d11b90eb42e141035f5d83"} Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.966334 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13778932c9363765f2f46f86a85860832e3f9ef86d11b90eb42e141035f5d83" Mar 19 19:50:05 crc kubenswrapper[5033]: I0319 19:50:05.966355 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-tks4t" Mar 19 19:50:06 crc kubenswrapper[5033]: I0319 19:50:06.027731 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-l92rr"] Mar 19 19:50:06 crc kubenswrapper[5033]: I0319 19:50:06.039627 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-l92rr"] Mar 19 19:50:06 crc kubenswrapper[5033]: I0319 19:50:06.620980 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:50:06 crc kubenswrapper[5033]: E0319 19:50:06.621506 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:50:06 crc kubenswrapper[5033]: I0319 19:50:06.642833 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf2dd1b-44e9-442a-8f70-9ef6ff99230b" path="/var/lib/kubelet/pods/dbf2dd1b-44e9-442a-8f70-9ef6ff99230b/volumes" Mar 19 19:50:17 crc kubenswrapper[5033]: I0319 19:50:17.042129 5033 scope.go:117] "RemoveContainer" containerID="80f9e2108121b537c5088939aaea09b200275ca4214cfeec23f2af76e1fbeea6" Mar 19 19:50:17 crc kubenswrapper[5033]: I0319 19:50:17.620236 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:50:17 crc kubenswrapper[5033]: E0319 19:50:17.620750 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:50:31 crc kubenswrapper[5033]: I0319 19:50:31.621346 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:50:31 crc kubenswrapper[5033]: E0319 19:50:31.622248 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:50:45 crc kubenswrapper[5033]: I0319 19:50:45.621044 5033 scope.go:117] "RemoveContainer" 
containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:50:45 crc kubenswrapper[5033]: E0319 19:50:45.622233 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:50:59 crc kubenswrapper[5033]: I0319 19:50:59.621127 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:50:59 crc kubenswrapper[5033]: E0319 19:50:59.622335 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:51:12 crc kubenswrapper[5033]: I0319 19:51:12.621228 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:51:12 crc kubenswrapper[5033]: E0319 19:51:12.622117 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.136824 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"] Mar 19 19:51:14 crc kubenswrapper[5033]: E0319 19:51:14.137274 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab0825a-c5eb-42df-9dcc-15e4f6686e42" containerName="oc" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.137288 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab0825a-c5eb-42df-9dcc-15e4f6686e42" containerName="oc" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.137494 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab0825a-c5eb-42df-9dcc-15e4f6686e42" containerName="oc" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.138935 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.153889 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"] Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.248663 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnn2\" (UniqueName: \"kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.249008 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.249058 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.351090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnn2\" (UniqueName: \"kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.351214 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.351315 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.351777 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.351837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.372229 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wnn2\" (UniqueName: \"kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2\") pod \"redhat-operators-rq8kh\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") " pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.472095 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq8kh" Mar 19 19:51:14 crc kubenswrapper[5033]: I0319 19:51:14.919286 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"] Mar 19 19:51:15 crc kubenswrapper[5033]: I0319 19:51:15.279243 5033 generic.go:334] "Generic (PLEG): container finished" podID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerID="4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7" exitCode=0 Mar 19 19:51:15 crc kubenswrapper[5033]: I0319 19:51:15.279304 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerDied","Data":"4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7"} Mar 19 19:51:15 crc kubenswrapper[5033]: I0319 19:51:15.279340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerStarted","Data":"380ace3d03983c96966e07e33a23319461b4f77388480783764d2ad8369a7822"} Mar 19 19:51:16 crc kubenswrapper[5033]: I0319 19:51:16.293888 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerStarted","Data":"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"}
Mar 19 19:51:17 crc kubenswrapper[5033]: I0319 19:51:17.119044 5033 scope.go:117] "RemoveContainer" containerID="1f899451c235f50517a105c9973d314319ce20edeb152ce40e40c11ac18457c8"
Mar 19 19:51:17 crc kubenswrapper[5033]: I0319 19:51:17.141198 5033 scope.go:117] "RemoveContainer" containerID="1efda9a0dc34e3df8e9778a4263b83421e39e30a85db79899c50b5298a53de8b"
Mar 19 19:51:17 crc kubenswrapper[5033]: I0319 19:51:17.190929 5033 scope.go:117] "RemoveContainer" containerID="b4fe435a7fd132fe1d089f14dab7ec9e9e24b5cd78f60a54f7e099d543aedcff"
Mar 19 19:51:21 crc kubenswrapper[5033]: I0319 19:51:21.348129 5033 generic.go:334] "Generic (PLEG): container finished" podID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerID="49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3" exitCode=0
Mar 19 19:51:21 crc kubenswrapper[5033]: I0319 19:51:21.348230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerDied","Data":"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"}
Mar 19 19:51:22 crc kubenswrapper[5033]: I0319 19:51:22.364841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerStarted","Data":"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"}
Mar 19 19:51:22 crc kubenswrapper[5033]: I0319 19:51:22.400535 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rq8kh" podStartSLOduration=1.932284785 podStartE2EDuration="8.400511536s" podCreationTimestamp="2026-03-19 19:51:14 +0000 UTC" firstStartedPulling="2026-03-19 19:51:15.28228628 +0000 UTC m=+3285.387316129" lastFinishedPulling="2026-03-19 19:51:21.750513031 +0000 UTC m=+3291.855542880" observedRunningTime="2026-03-19 19:51:22.383084693 +0000 UTC m=+3292.488114542" watchObservedRunningTime="2026-03-19 19:51:22.400511536 +0000 UTC m=+3292.505541385"
Mar 19 19:51:24 crc kubenswrapper[5033]: I0319 19:51:24.472292 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:24 crc kubenswrapper[5033]: I0319 19:51:24.472916 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:25 crc kubenswrapper[5033]: I0319 19:51:25.542507 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rq8kh" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server" probeResult="failure" output=<
Mar 19 19:51:25 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Mar 19 19:51:25 crc kubenswrapper[5033]: >
Mar 19 19:51:27 crc kubenswrapper[5033]: I0319 19:51:27.621344 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:51:27 crc kubenswrapper[5033]: E0319 19:51:27.622302 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:51:35 crc kubenswrapper[5033]: I0319 19:51:35.534698 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rq8kh" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server" probeResult="failure" output=<
Mar 19 19:51:35 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Mar 19 19:51:35 crc kubenswrapper[5033]: >
Mar 19 19:51:38 crc kubenswrapper[5033]: I0319 19:51:38.621240 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:51:38 crc kubenswrapper[5033]: E0319 19:51:38.621808 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:51:44 crc kubenswrapper[5033]: I0319 19:51:44.524803 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:44 crc kubenswrapper[5033]: I0319 19:51:44.588369 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:45 crc kubenswrapper[5033]: I0319 19:51:45.102737 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"]
Mar 19 19:51:45 crc kubenswrapper[5033]: I0319 19:51:45.563061 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rq8kh" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server" containerID="cri-o://e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce" gracePeriod=2
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.095669 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.189257 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities\") pod \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") "
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.189758 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wnn2\" (UniqueName: \"kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2\") pod \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") "
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.189819 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content\") pod \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\" (UID: \"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f\") "
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.190031 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities" (OuterVolumeSpecName: "utilities") pod "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" (UID: "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.190854 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.195235 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2" (OuterVolumeSpecName: "kube-api-access-5wnn2") pod "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" (UID: "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f"). InnerVolumeSpecName "kube-api-access-5wnn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.293637 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wnn2\" (UniqueName: \"kubernetes.io/projected/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-kube-api-access-5wnn2\") on node \"crc\" DevicePath \"\""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.319092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" (UID: "f31dc3a9-b15d-4b3a-967c-c6dad3fc035f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.396045 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.578820 5033 generic.go:334] "Generic (PLEG): container finished" podID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerID="e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce" exitCode=0
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.578865 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerDied","Data":"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"}
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.578906 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rq8kh" event={"ID":"f31dc3a9-b15d-4b3a-967c-c6dad3fc035f","Type":"ContainerDied","Data":"380ace3d03983c96966e07e33a23319461b4f77388480783764d2ad8369a7822"}
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.578929 5033 scope.go:117] "RemoveContainer" containerID="e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.578951 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rq8kh"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.618073 5033 scope.go:117] "RemoveContainer" containerID="49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.639498 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"]
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.639605 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rq8kh"]
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.652887 5033 scope.go:117] "RemoveContainer" containerID="4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.700540 5033 scope.go:117] "RemoveContainer" containerID="e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"
Mar 19 19:51:46 crc kubenswrapper[5033]: E0319 19:51:46.701039 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce\": container with ID starting with e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce not found: ID does not exist" containerID="e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.701092 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce"} err="failed to get container status \"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce\": rpc error: code = NotFound desc = could not find container \"e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce\": container with ID starting with e2191de3b9330e2f777f69d052b8970f6a838d9b49d4d41a288c487cfb1364ce not found: ID does not exist"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.701127 5033 scope.go:117] "RemoveContainer" containerID="49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"
Mar 19 19:51:46 crc kubenswrapper[5033]: E0319 19:51:46.701802 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3\": container with ID starting with 49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3 not found: ID does not exist" containerID="49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.701843 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3"} err="failed to get container status \"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3\": rpc error: code = NotFound desc = could not find container \"49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3\": container with ID starting with 49f3051a65763cae152e8a2222d5221ddeeaea5f17e6e11cc6a0a196996a24f3 not found: ID does not exist"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.701871 5033 scope.go:117] "RemoveContainer" containerID="4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7"
Mar 19 19:51:46 crc kubenswrapper[5033]: E0319 19:51:46.702215 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7\": container with ID starting with 4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7 not found: ID does not exist" containerID="4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7"
Mar 19 19:51:46 crc kubenswrapper[5033]: I0319 19:51:46.702255 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7"} err="failed to get container status \"4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7\": rpc error: code = NotFound desc = could not find container \"4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7\": container with ID starting with 4f8e6c57fb3f17fa0a515ae319fbb5adda67264050dbcb39e26070314bf853f7 not found: ID does not exist"
Mar 19 19:51:48 crc kubenswrapper[5033]: I0319 19:51:48.632648 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" path="/var/lib/kubelet/pods/f31dc3a9-b15d-4b3a-967c-c6dad3fc035f/volumes"
Mar 19 19:51:49 crc kubenswrapper[5033]: I0319 19:51:49.621415 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:51:49 crc kubenswrapper[5033]: E0319 19:51:49.621910 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.157293 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565832-wncm9"]
Mar 19 19:52:00 crc kubenswrapper[5033]: E0319 19:52:00.159140 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="extract-content"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.159176 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="extract-content"
Mar 19 19:52:00 crc kubenswrapper[5033]: E0319 19:52:00.159227 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="extract-utilities"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.159245 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="extract-utilities"
Mar 19 19:52:00 crc kubenswrapper[5033]: E0319 19:52:00.159313 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.159362 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.159950 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31dc3a9-b15d-4b3a-967c-c6dad3fc035f" containerName="registry-server"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.161424 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.164775 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.164859 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.164872 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.172897 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-wncm9"]
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.189044 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phg5b\" (UniqueName: \"kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b\") pod \"auto-csr-approver-29565832-wncm9\" (UID: \"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1\") " pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.291444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phg5b\" (UniqueName: \"kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b\") pod \"auto-csr-approver-29565832-wncm9\" (UID: \"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1\") " pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.308154 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phg5b\" (UniqueName: \"kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b\") pod \"auto-csr-approver-29565832-wncm9\" (UID: \"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1\") " pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.484495 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:00 crc kubenswrapper[5033]: I0319 19:52:00.928224 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-wncm9"]
Mar 19 19:52:01 crc kubenswrapper[5033]: I0319 19:52:01.737587 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-wncm9" event={"ID":"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1","Type":"ContainerStarted","Data":"c19dc8a2d5f2c23c5503f286470a832ca25a559ff01cc5bbcee7f314d5ad9c2f"}
Mar 19 19:52:02 crc kubenswrapper[5033]: I0319 19:52:02.622098 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:52:02 crc kubenswrapper[5033]: E0319 19:52:02.623134 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:52:02 crc kubenswrapper[5033]: I0319 19:52:02.749971 5033 generic.go:334] "Generic (PLEG): container finished" podID="9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" containerID="1e3e0ace502610718aab4add7fcc50bdbc2bda1d3d4b29d95816b7d7659aff82" exitCode=0
Mar 19 19:52:02 crc kubenswrapper[5033]: I0319 19:52:02.750044 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-wncm9" event={"ID":"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1","Type":"ContainerDied","Data":"1e3e0ace502610718aab4add7fcc50bdbc2bda1d3d4b29d95816b7d7659aff82"}
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.224903 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.277237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phg5b\" (UniqueName: \"kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b\") pod \"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1\" (UID: \"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1\") "
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.282057 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b" (OuterVolumeSpecName: "kube-api-access-phg5b") pod "9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" (UID: "9eb7a216-801f-4c2f-ad6c-1aa009ac10b1"). InnerVolumeSpecName "kube-api-access-phg5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.379893 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phg5b\" (UniqueName: \"kubernetes.io/projected/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1-kube-api-access-phg5b\") on node \"crc\" DevicePath \"\""
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.771116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-wncm9" event={"ID":"9eb7a216-801f-4c2f-ad6c-1aa009ac10b1","Type":"ContainerDied","Data":"c19dc8a2d5f2c23c5503f286470a832ca25a559ff01cc5bbcee7f314d5ad9c2f"}
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.771168 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-wncm9"
Mar 19 19:52:04 crc kubenswrapper[5033]: I0319 19:52:04.771173 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19dc8a2d5f2c23c5503f286470a832ca25a559ff01cc5bbcee7f314d5ad9c2f"
Mar 19 19:52:05 crc kubenswrapper[5033]: I0319 19:52:05.290827 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-fg6wn"]
Mar 19 19:52:05 crc kubenswrapper[5033]: I0319 19:52:05.301753 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-fg6wn"]
Mar 19 19:52:06 crc kubenswrapper[5033]: I0319 19:52:06.639650 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb134de-009b-472e-9c57-b39f7b1fa5ec" path="/var/lib/kubelet/pods/6bb134de-009b-472e-9c57-b39f7b1fa5ec/volumes"
Mar 19 19:52:16 crc kubenswrapper[5033]: I0319 19:52:16.621617 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:52:16 crc kubenswrapper[5033]: E0319 19:52:16.622922 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:52:17 crc kubenswrapper[5033]: I0319 19:52:17.277186 5033 scope.go:117] "RemoveContainer" containerID="2f89c2e06e868dbdbc544812487ac4a8aa35ce4986d788123602210b15ae0ba9"
Mar 19 19:52:29 crc kubenswrapper[5033]: I0319 19:52:29.620072 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:52:29 crc kubenswrapper[5033]: E0319 19:52:29.620827 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:52:40 crc kubenswrapper[5033]: I0319 19:52:40.633672 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:52:40 crc kubenswrapper[5033]: E0319 19:52:40.635119 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:52:51 crc kubenswrapper[5033]: I0319 19:52:51.620292 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:52:51 crc kubenswrapper[5033]: E0319 19:52:51.621126 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:53:02 crc kubenswrapper[5033]: I0319 19:53:02.621064 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:53:02 crc kubenswrapper[5033]: E0319 19:53:02.622017 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:53:16 crc kubenswrapper[5033]: I0319 19:53:16.620890 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:53:16 crc kubenswrapper[5033]: E0319 19:53:16.622030 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:53:30 crc kubenswrapper[5033]: I0319 19:53:30.627961 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:53:30 crc kubenswrapper[5033]: E0319 19:53:30.628863 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:53:42 crc kubenswrapper[5033]: I0319 19:53:42.620500 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:53:42 crc kubenswrapper[5033]: E0319 19:53:42.621204 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:53:53 crc kubenswrapper[5033]: I0319 19:53:53.621190 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:53:53 crc kubenswrapper[5033]: E0319 19:53:53.621985 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.164842 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565834-rdp8m"]
Mar 19 19:54:00 crc kubenswrapper[5033]: E0319 19:54:00.166230 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" containerName="oc"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.166254 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" containerName="oc"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.166667 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" containerName="oc"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.167894 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.175165 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.175645 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.175723 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.184983 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-rdp8m"]
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.231747 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mww8c\" (UniqueName: \"kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c\") pod \"auto-csr-approver-29565834-rdp8m\" (UID: \"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe\") " pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.333908 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mww8c\" (UniqueName: \"kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c\") pod \"auto-csr-approver-29565834-rdp8m\" (UID: \"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe\") " pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.355141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mww8c\" (UniqueName: \"kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c\") pod \"auto-csr-approver-29565834-rdp8m\" (UID: \"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe\") " pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.506620 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.981719 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-rdp8m"]
Mar 19 19:54:00 crc kubenswrapper[5033]: I0319 19:54:00.983165 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 19:54:01 crc kubenswrapper[5033]: I0319 19:54:01.040105 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-rdp8m" event={"ID":"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe","Type":"ContainerStarted","Data":"acfac6077a7ded1dc5a1f3d2f5eff6e61da81960803dcf884ee34fcecd9bc2e3"}
Mar 19 19:54:03 crc kubenswrapper[5033]: I0319 19:54:03.066924 5033 generic.go:334] "Generic (PLEG): container finished" podID="245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" containerID="3afcd07acd026610888aa77b0856e98a5304695c9c0118ef14986b1b3985c77b" exitCode=0
Mar 19 19:54:03 crc kubenswrapper[5033]: I0319 19:54:03.066979 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-rdp8m" event={"ID":"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe","Type":"ContainerDied","Data":"3afcd07acd026610888aa77b0856e98a5304695c9c0118ef14986b1b3985c77b"}
Mar 19 19:54:04 crc kubenswrapper[5033]: I0319 19:54:04.520669 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:04 crc kubenswrapper[5033]: I0319 19:54:04.628090 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mww8c\" (UniqueName: \"kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c\") pod \"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe\" (UID: \"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe\") "
Mar 19 19:54:04 crc kubenswrapper[5033]: I0319 19:54:04.633382 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c" (OuterVolumeSpecName: "kube-api-access-mww8c") pod "245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" (UID: "245ffa8e-4d63-4f15-be05-f0f3fd6cecbe"). InnerVolumeSpecName "kube-api-access-mww8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:54:04 crc kubenswrapper[5033]: I0319 19:54:04.732982 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mww8c\" (UniqueName: \"kubernetes.io/projected/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe-kube-api-access-mww8c\") on node \"crc\" DevicePath \"\""
Mar 19 19:54:05 crc kubenswrapper[5033]: I0319 19:54:05.094149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-rdp8m" event={"ID":"245ffa8e-4d63-4f15-be05-f0f3fd6cecbe","Type":"ContainerDied","Data":"acfac6077a7ded1dc5a1f3d2f5eff6e61da81960803dcf884ee34fcecd9bc2e3"}
Mar 19 19:54:05 crc kubenswrapper[5033]: I0319 19:54:05.094192 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acfac6077a7ded1dc5a1f3d2f5eff6e61da81960803dcf884ee34fcecd9bc2e3"
Mar 19 19:54:05 crc kubenswrapper[5033]: I0319 19:54:05.094255 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-rdp8m"
Mar 19 19:54:05 crc kubenswrapper[5033]: I0319 19:54:05.604936 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-wksld"]
Mar 19 19:54:05 crc kubenswrapper[5033]: I0319 19:54:05.613476 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-wksld"]
Mar 19 19:54:06 crc kubenswrapper[5033]: I0319 19:54:06.631325 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3499873d-d658-4518-a7a0-856060368e78" path="/var/lib/kubelet/pods/3499873d-d658-4518-a7a0-856060368e78/volumes"
Mar 19 19:54:07 crc kubenswrapper[5033]: I0319 19:54:07.621214 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1"
Mar 19 19:54:07 crc kubenswrapper[5033]: E0319 19:54:07.621530 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817"
Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.393156 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4bw7v"]
Mar 19 19:54:15 crc kubenswrapper[5033]: E0319 19:54:15.394523 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" containerName="oc"
Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.394547 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" containerName="oc"
Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.394900 5033 memory_manager.go:354] "RemoveStaleState
removing state" podUID="245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" containerName="oc" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.398583 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.422057 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4bw7v"] Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.475313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-utilities\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.475373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvbh\" (UniqueName: \"kubernetes.io/projected/76b35dca-6939-425e-80ce-4f8801214a28-kube-api-access-wrvbh\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.475399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-catalog-content\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.578061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-utilities\") pod \"certified-operators-4bw7v\" (UID: 
\"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.578181 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvbh\" (UniqueName: \"kubernetes.io/projected/76b35dca-6939-425e-80ce-4f8801214a28-kube-api-access-wrvbh\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.578213 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-catalog-content\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.579181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-utilities\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.579232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76b35dca-6939-425e-80ce-4f8801214a28-catalog-content\") pod \"certified-operators-4bw7v\" (UID: \"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.601990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvbh\" (UniqueName: \"kubernetes.io/projected/76b35dca-6939-425e-80ce-4f8801214a28-kube-api-access-wrvbh\") pod \"certified-operators-4bw7v\" (UID: 
\"76b35dca-6939-425e-80ce-4f8801214a28\") " pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:15 crc kubenswrapper[5033]: I0319 19:54:15.723430 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:16 crc kubenswrapper[5033]: I0319 19:54:16.264879 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4bw7v"] Mar 19 19:54:17 crc kubenswrapper[5033]: I0319 19:54:17.228465 5033 generic.go:334] "Generic (PLEG): container finished" podID="76b35dca-6939-425e-80ce-4f8801214a28" containerID="fd9219b08fa8b95d656f76087546bf4ace317d16b51fae5d64c47c266aeed69f" exitCode=0 Mar 19 19:54:17 crc kubenswrapper[5033]: I0319 19:54:17.228521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4bw7v" event={"ID":"76b35dca-6939-425e-80ce-4f8801214a28","Type":"ContainerDied","Data":"fd9219b08fa8b95d656f76087546bf4ace317d16b51fae5d64c47c266aeed69f"} Mar 19 19:54:17 crc kubenswrapper[5033]: I0319 19:54:17.228903 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4bw7v" event={"ID":"76b35dca-6939-425e-80ce-4f8801214a28","Type":"ContainerStarted","Data":"ffea58e63600f1919db29fdbb990f1e423a741a58760e113d5158b9c59a91de2"} Mar 19 19:54:17 crc kubenswrapper[5033]: I0319 19:54:17.401473 5033 scope.go:117] "RemoveContainer" containerID="76af51119966991733dc50fb61c7517eda67919960948e30f77c3e2ea2f7e912" Mar 19 19:54:22 crc kubenswrapper[5033]: I0319 19:54:22.288689 5033 generic.go:334] "Generic (PLEG): container finished" podID="76b35dca-6939-425e-80ce-4f8801214a28" containerID="8343a49afdbc8ed05c45ae6606f73a2d20114df05cddf3ccec239a3985eb3981" exitCode=0 Mar 19 19:54:22 crc kubenswrapper[5033]: I0319 19:54:22.288752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4bw7v" 
event={"ID":"76b35dca-6939-425e-80ce-4f8801214a28","Type":"ContainerDied","Data":"8343a49afdbc8ed05c45ae6606f73a2d20114df05cddf3ccec239a3985eb3981"} Mar 19 19:54:22 crc kubenswrapper[5033]: I0319 19:54:22.622778 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:54:22 crc kubenswrapper[5033]: E0319 19:54:22.623044 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:54:23 crc kubenswrapper[5033]: I0319 19:54:23.300384 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4bw7v" event={"ID":"76b35dca-6939-425e-80ce-4f8801214a28","Type":"ContainerStarted","Data":"4d5c5018eb1635c7e3320430b4c73e2c0f1f0e9207b3c686e3c3e3d3ab211e11"} Mar 19 19:54:23 crc kubenswrapper[5033]: I0319 19:54:23.317898 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4bw7v" podStartSLOduration=2.8011865670000002 podStartE2EDuration="8.31788077s" podCreationTimestamp="2026-03-19 19:54:15 +0000 UTC" firstStartedPulling="2026-03-19 19:54:17.2321152 +0000 UTC m=+3467.337145049" lastFinishedPulling="2026-03-19 19:54:22.748809403 +0000 UTC m=+3472.853839252" observedRunningTime="2026-03-19 19:54:23.31363701 +0000 UTC m=+3473.418666869" watchObservedRunningTime="2026-03-19 19:54:23.31788077 +0000 UTC m=+3473.422910619" Mar 19 19:54:25 crc kubenswrapper[5033]: I0319 19:54:25.724411 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:25 crc 
kubenswrapper[5033]: I0319 19:54:25.724934 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:25 crc kubenswrapper[5033]: I0319 19:54:25.767875 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:35 crc kubenswrapper[5033]: I0319 19:54:35.777399 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4bw7v" Mar 19 19:54:35 crc kubenswrapper[5033]: I0319 19:54:35.845164 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4bw7v"] Mar 19 19:54:35 crc kubenswrapper[5033]: I0319 19:54:35.889009 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:54:35 crc kubenswrapper[5033]: I0319 19:54:35.889474 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c5l4v" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="registry-server" containerID="cri-o://73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b" gracePeriod=2 Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.421928 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.437265 5033 generic.go:334] "Generic (PLEG): container finished" podID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerID="73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b" exitCode=0 Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.437337 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5l4v" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.437369 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerDied","Data":"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b"} Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.437421 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5l4v" event={"ID":"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5","Type":"ContainerDied","Data":"41ae5a34c3ef319ce37c9772d355a8beef927b0b071b16b8e6b3c3910b563400"} Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.437443 5033 scope.go:117] "RemoveContainer" containerID="73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.464241 5033 scope.go:117] "RemoveContainer" containerID="ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.485661 5033 scope.go:117] "RemoveContainer" containerID="ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.535857 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content\") pod \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.536038 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85l5n\" (UniqueName: \"kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n\") pod \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " Mar 19 19:54:36 
crc kubenswrapper[5033]: I0319 19:54:36.536132 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities\") pod \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\" (UID: \"a6eeb6a4-deb0-4da9-a897-f46c8f7913f5\") " Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.537614 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities" (OuterVolumeSpecName: "utilities") pod "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" (UID: "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.543096 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n" (OuterVolumeSpecName: "kube-api-access-85l5n") pod "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" (UID: "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5"). InnerVolumeSpecName "kube-api-access-85l5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.564332 5033 scope.go:117] "RemoveContainer" containerID="73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b" Mar 19 19:54:36 crc kubenswrapper[5033]: E0319 19:54:36.568550 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b\": container with ID starting with 73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b not found: ID does not exist" containerID="73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.568589 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b"} err="failed to get container status \"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b\": rpc error: code = NotFound desc = could not find container \"73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b\": container with ID starting with 73431aac6b7667bcffdbf63936f8d64b42d5507d26f1a47aebab867dc56c939b not found: ID does not exist" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.568609 5033 scope.go:117] "RemoveContainer" containerID="ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba" Mar 19 19:54:36 crc kubenswrapper[5033]: E0319 19:54:36.573839 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba\": container with ID starting with ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba not found: ID does not exist" containerID="ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.573875 
5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba"} err="failed to get container status \"ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba\": rpc error: code = NotFound desc = could not find container \"ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba\": container with ID starting with ea8f890bef81da089974adde26fa7383f18bdba5fd00d2682c2f374ede723eba not found: ID does not exist" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.573896 5033 scope.go:117] "RemoveContainer" containerID="ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1" Mar 19 19:54:36 crc kubenswrapper[5033]: E0319 19:54:36.576747 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1\": container with ID starting with ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1 not found: ID does not exist" containerID="ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.576794 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1"} err="failed to get container status \"ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1\": rpc error: code = NotFound desc = could not find container \"ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1\": container with ID starting with ab18990ba7af2f36fb9d84e4c6e8583810d4f3aa16aaf9fe6144f2f5f2001ea1 not found: ID does not exist" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.599320 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" (UID: "a6eeb6a4-deb0-4da9-a897-f46c8f7913f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.638050 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.638084 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85l5n\" (UniqueName: \"kubernetes.io/projected/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-kube-api-access-85l5n\") on node \"crc\" DevicePath \"\"" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.638099 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.763724 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:54:36 crc kubenswrapper[5033]: I0319 19:54:36.772088 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c5l4v"] Mar 19 19:54:37 crc kubenswrapper[5033]: I0319 19:54:37.621057 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:54:37 crc kubenswrapper[5033]: E0319 19:54:37.621807 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 19:54:38 crc kubenswrapper[5033]: I0319 19:54:38.633414 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" path="/var/lib/kubelet/pods/a6eeb6a4-deb0-4da9-a897-f46c8f7913f5/volumes" Mar 19 19:54:52 crc kubenswrapper[5033]: I0319 19:54:52.621860 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:54:53 crc kubenswrapper[5033]: I0319 19:54:53.596849 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841"} Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.735487 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:24 crc kubenswrapper[5033]: E0319 19:55:24.736530 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="extract-content" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.736546 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="extract-content" Mar 19 19:55:24 crc kubenswrapper[5033]: E0319 19:55:24.736570 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="registry-server" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.736576 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="registry-server" Mar 19 19:55:24 crc kubenswrapper[5033]: E0319 19:55:24.736594 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="extract-utilities" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.736601 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="extract-utilities" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.736805 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6eeb6a4-deb0-4da9-a897-f46c8f7913f5" containerName="registry-server" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.738441 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.744477 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.916404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.916902 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:24 crc kubenswrapper[5033]: I0319 19:55:24.916984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnfd5\" (UniqueName: \"kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5\") pod \"redhat-marketplace-4ldgw\" (UID: 
\"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.019337 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.019408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnfd5\" (UniqueName: \"kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.019524 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.019840 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.020093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " 
pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.039863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnfd5\" (UniqueName: \"kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5\") pod \"redhat-marketplace-4ldgw\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.073921 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.536064 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.969341 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerID="0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd" exitCode=0 Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.969388 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerDied","Data":"0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd"} Mar 19 19:55:25 crc kubenswrapper[5033]: I0319 19:55:25.969418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerStarted","Data":"4a9c551da591af43ea49b57f537cce19607fff46e69aa3ef23d4441aadd32251"} Mar 19 19:55:26 crc kubenswrapper[5033]: I0319 19:55:26.979721 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" 
event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerStarted","Data":"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58"} Mar 19 19:55:29 crc kubenswrapper[5033]: I0319 19:55:29.006899 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerID="3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58" exitCode=0 Mar 19 19:55:29 crc kubenswrapper[5033]: I0319 19:55:29.006986 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerDied","Data":"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58"} Mar 19 19:55:30 crc kubenswrapper[5033]: I0319 19:55:30.018315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerStarted","Data":"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f"} Mar 19 19:55:30 crc kubenswrapper[5033]: I0319 19:55:30.042034 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ldgw" podStartSLOduration=2.453167247 podStartE2EDuration="6.04201607s" podCreationTimestamp="2026-03-19 19:55:24 +0000 UTC" firstStartedPulling="2026-03-19 19:55:25.971386478 +0000 UTC m=+3536.076416337" lastFinishedPulling="2026-03-19 19:55:29.560235311 +0000 UTC m=+3539.665265160" observedRunningTime="2026-03-19 19:55:30.035667391 +0000 UTC m=+3540.140697240" watchObservedRunningTime="2026-03-19 19:55:30.04201607 +0000 UTC m=+3540.147045919" Mar 19 19:55:35 crc kubenswrapper[5033]: I0319 19:55:35.074765 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:35 crc kubenswrapper[5033]: I0319 19:55:35.076350 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:35 crc kubenswrapper[5033]: I0319 19:55:35.137299 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:36 crc kubenswrapper[5033]: I0319 19:55:36.145842 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:36 crc kubenswrapper[5033]: I0319 19:55:36.204294 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.106816 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ldgw" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="registry-server" containerID="cri-o://f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f" gracePeriod=2 Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.608567 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.705236 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnfd5\" (UniqueName: \"kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5\") pod \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.705302 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities\") pod \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.705359 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content\") pod \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\" (UID: \"9e91d9e2-f569-4663-8cf5-d9d27a8c21af\") " Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.706093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities" (OuterVolumeSpecName: "utilities") pod "9e91d9e2-f569-4663-8cf5-d9d27a8c21af" (UID: "9e91d9e2-f569-4663-8cf5-d9d27a8c21af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.712157 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5" (OuterVolumeSpecName: "kube-api-access-fnfd5") pod "9e91d9e2-f569-4663-8cf5-d9d27a8c21af" (UID: "9e91d9e2-f569-4663-8cf5-d9d27a8c21af"). InnerVolumeSpecName "kube-api-access-fnfd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.733855 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e91d9e2-f569-4663-8cf5-d9d27a8c21af" (UID: "9e91d9e2-f569-4663-8cf5-d9d27a8c21af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.807415 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.807465 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:38 crc kubenswrapper[5033]: I0319 19:55:38.807479 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnfd5\" (UniqueName: \"kubernetes.io/projected/9e91d9e2-f569-4663-8cf5-d9d27a8c21af-kube-api-access-fnfd5\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.122160 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerID="f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f" exitCode=0 Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.122529 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ldgw" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.122558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerDied","Data":"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f"} Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.123286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ldgw" event={"ID":"9e91d9e2-f569-4663-8cf5-d9d27a8c21af","Type":"ContainerDied","Data":"4a9c551da591af43ea49b57f537cce19607fff46e69aa3ef23d4441aadd32251"} Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.123317 5033 scope.go:117] "RemoveContainer" containerID="f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.157645 5033 scope.go:117] "RemoveContainer" containerID="3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.187619 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.201206 5033 scope.go:117] "RemoveContainer" containerID="0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.201206 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ldgw"] Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.249698 5033 scope.go:117] "RemoveContainer" containerID="f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f" Mar 19 19:55:39 crc kubenswrapper[5033]: E0319 19:55:39.250223 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f\": container with ID starting with f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f not found: ID does not exist" containerID="f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.250261 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f"} err="failed to get container status \"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f\": rpc error: code = NotFound desc = could not find container \"f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f\": container with ID starting with f226cd7feb18e9f08ebad13a3dda2e0abd69204f12a1632bb2e82a1a91ae0a0f not found: ID does not exist" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.250287 5033 scope.go:117] "RemoveContainer" containerID="3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58" Mar 19 19:55:39 crc kubenswrapper[5033]: E0319 19:55:39.250630 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58\": container with ID starting with 3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58 not found: ID does not exist" containerID="3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.250689 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58"} err="failed to get container status \"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58\": rpc error: code = NotFound desc = could not find container \"3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58\": container with ID 
starting with 3be58d3dccd579d986a22f625b94bd5d4f951a544e7220ee2ad6e67e6c9f2e58 not found: ID does not exist" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.250715 5033 scope.go:117] "RemoveContainer" containerID="0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd" Mar 19 19:55:39 crc kubenswrapper[5033]: E0319 19:55:39.251056 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd\": container with ID starting with 0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd not found: ID does not exist" containerID="0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd" Mar 19 19:55:39 crc kubenswrapper[5033]: I0319 19:55:39.251091 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd"} err="failed to get container status \"0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd\": rpc error: code = NotFound desc = could not find container \"0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd\": container with ID starting with 0211b43592d8e2ab3990274960fab1d23becb5543079dbc63c72a635f957e3cd not found: ID does not exist" Mar 19 19:55:40 crc kubenswrapper[5033]: I0319 19:55:40.646301 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" path="/var/lib/kubelet/pods/9e91d9e2-f569-4663-8cf5-d9d27a8c21af/volumes" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.148089 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565836-xhwbf"] Mar 19 19:56:00 crc kubenswrapper[5033]: E0319 19:56:00.149056 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="registry-server" Mar 19 19:56:00 crc 
kubenswrapper[5033]: I0319 19:56:00.149072 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="registry-server" Mar 19 19:56:00 crc kubenswrapper[5033]: E0319 19:56:00.149089 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="extract-content" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.149095 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="extract-content" Mar 19 19:56:00 crc kubenswrapper[5033]: E0319 19:56:00.149106 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="extract-utilities" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.149112 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="extract-utilities" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.149290 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e91d9e2-f569-4663-8cf5-d9d27a8c21af" containerName="registry-server" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.150156 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.152288 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.152446 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.154523 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.181816 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-xhwbf"] Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.267921 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8csc8\" (UniqueName: \"kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8\") pod \"auto-csr-approver-29565836-xhwbf\" (UID: \"27a7b950-c618-442c-922d-f0c6f91463cb\") " pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.370536 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8csc8\" (UniqueName: \"kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8\") pod \"auto-csr-approver-29565836-xhwbf\" (UID: \"27a7b950-c618-442c-922d-f0c6f91463cb\") " pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.390188 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8csc8\" (UniqueName: \"kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8\") pod \"auto-csr-approver-29565836-xhwbf\" (UID: \"27a7b950-c618-442c-922d-f0c6f91463cb\") " 
pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.479540 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:00 crc kubenswrapper[5033]: I0319 19:56:00.948105 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-xhwbf"] Mar 19 19:56:01 crc kubenswrapper[5033]: I0319 19:56:01.326383 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" event={"ID":"27a7b950-c618-442c-922d-f0c6f91463cb","Type":"ContainerStarted","Data":"72ac68c1a7bcf96b5777c338a59a0a8483957e1248ecc26cfe6d62ebe1ae0648"} Mar 19 19:56:03 crc kubenswrapper[5033]: I0319 19:56:03.349433 5033 generic.go:334] "Generic (PLEG): container finished" podID="27a7b950-c618-442c-922d-f0c6f91463cb" containerID="d9bf45b322e01e033941d9c6618511fc87d7cc01b3d40a18a86a87889ed15d2f" exitCode=0 Mar 19 19:56:03 crc kubenswrapper[5033]: I0319 19:56:03.349550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" event={"ID":"27a7b950-c618-442c-922d-f0c6f91463cb","Type":"ContainerDied","Data":"d9bf45b322e01e033941d9c6618511fc87d7cc01b3d40a18a86a87889ed15d2f"} Mar 19 19:56:04 crc kubenswrapper[5033]: I0319 19:56:04.757053 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:04 crc kubenswrapper[5033]: I0319 19:56:04.867587 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8csc8\" (UniqueName: \"kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8\") pod \"27a7b950-c618-442c-922d-f0c6f91463cb\" (UID: \"27a7b950-c618-442c-922d-f0c6f91463cb\") " Mar 19 19:56:04 crc kubenswrapper[5033]: I0319 19:56:04.875784 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8" (OuterVolumeSpecName: "kube-api-access-8csc8") pod "27a7b950-c618-442c-922d-f0c6f91463cb" (UID: "27a7b950-c618-442c-922d-f0c6f91463cb"). InnerVolumeSpecName "kube-api-access-8csc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:56:04 crc kubenswrapper[5033]: I0319 19:56:04.970222 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8csc8\" (UniqueName: \"kubernetes.io/projected/27a7b950-c618-442c-922d-f0c6f91463cb-kube-api-access-8csc8\") on node \"crc\" DevicePath \"\"" Mar 19 19:56:05 crc kubenswrapper[5033]: I0319 19:56:05.375606 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" event={"ID":"27a7b950-c618-442c-922d-f0c6f91463cb","Type":"ContainerDied","Data":"72ac68c1a7bcf96b5777c338a59a0a8483957e1248ecc26cfe6d62ebe1ae0648"} Mar 19 19:56:05 crc kubenswrapper[5033]: I0319 19:56:05.375667 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72ac68c1a7bcf96b5777c338a59a0a8483957e1248ecc26cfe6d62ebe1ae0648" Mar 19 19:56:05 crc kubenswrapper[5033]: I0319 19:56:05.375684 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-xhwbf" Mar 19 19:56:05 crc kubenswrapper[5033]: I0319 19:56:05.837187 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-tks4t"] Mar 19 19:56:05 crc kubenswrapper[5033]: I0319 19:56:05.845888 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-tks4t"] Mar 19 19:56:06 crc kubenswrapper[5033]: I0319 19:56:06.631563 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab0825a-c5eb-42df-9dcc-15e4f6686e42" path="/var/lib/kubelet/pods/eab0825a-c5eb-42df-9dcc-15e4f6686e42/volumes" Mar 19 19:56:17 crc kubenswrapper[5033]: I0319 19:56:17.541231 5033 scope.go:117] "RemoveContainer" containerID="1ae27b480fe2169bd8c8bd6c8a867040c697e4849027866b73ee2c2c4dcf5d4e" Mar 19 19:57:10 crc kubenswrapper[5033]: I0319 19:57:10.758614 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:57:10 crc kubenswrapper[5033]: I0319 19:57:10.759111 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:57:40 crc kubenswrapper[5033]: I0319 19:57:40.758619 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:57:40 crc kubenswrapper[5033]: 
I0319 19:57:40.760996 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.153221 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565838-7fqvw"] Mar 19 19:58:00 crc kubenswrapper[5033]: E0319 19:58:00.154247 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a7b950-c618-442c-922d-f0c6f91463cb" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.154261 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a7b950-c618-442c-922d-f0c6f91463cb" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.154495 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a7b950-c618-442c-922d-f0c6f91463cb" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.155225 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.157337 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.157716 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.158507 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.173232 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-7fqvw"] Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.305390 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8q2q\" (UniqueName: \"kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q\") pod \"auto-csr-approver-29565838-7fqvw\" (UID: \"09d57273-84e1-4d47-a881-11916cf47217\") " pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.407765 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8q2q\" (UniqueName: \"kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q\") pod \"auto-csr-approver-29565838-7fqvw\" (UID: \"09d57273-84e1-4d47-a881-11916cf47217\") " pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.427003 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8q2q\" (UniqueName: \"kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q\") pod \"auto-csr-approver-29565838-7fqvw\" (UID: \"09d57273-84e1-4d47-a881-11916cf47217\") " 
pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.475396 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:00 crc kubenswrapper[5033]: I0319 19:58:00.947311 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-7fqvw"] Mar 19 19:58:01 crc kubenswrapper[5033]: I0319 19:58:01.752548 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" event={"ID":"09d57273-84e1-4d47-a881-11916cf47217","Type":"ContainerStarted","Data":"8b53858208324be620865499604c1b9b3d6f05af9928b4cd7c0fb5080fa32304"} Mar 19 19:58:02 crc kubenswrapper[5033]: I0319 19:58:02.763833 5033 generic.go:334] "Generic (PLEG): container finished" podID="09d57273-84e1-4d47-a881-11916cf47217" containerID="11c67242ce89603f04863dcf7fc3e4da85746d9f8128b352d583e05336ab73d7" exitCode=0 Mar 19 19:58:02 crc kubenswrapper[5033]: I0319 19:58:02.764061 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" event={"ID":"09d57273-84e1-4d47-a881-11916cf47217","Type":"ContainerDied","Data":"11c67242ce89603f04863dcf7fc3e4da85746d9f8128b352d583e05336ab73d7"} Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.322838 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.396532 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8q2q\" (UniqueName: \"kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q\") pod \"09d57273-84e1-4d47-a881-11916cf47217\" (UID: \"09d57273-84e1-4d47-a881-11916cf47217\") " Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.402271 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q" (OuterVolumeSpecName: "kube-api-access-v8q2q") pod "09d57273-84e1-4d47-a881-11916cf47217" (UID: "09d57273-84e1-4d47-a881-11916cf47217"). InnerVolumeSpecName "kube-api-access-v8q2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.500640 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8q2q\" (UniqueName: \"kubernetes.io/projected/09d57273-84e1-4d47-a881-11916cf47217-kube-api-access-v8q2q\") on node \"crc\" DevicePath \"\"" Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.788619 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" event={"ID":"09d57273-84e1-4d47-a881-11916cf47217","Type":"ContainerDied","Data":"8b53858208324be620865499604c1b9b3d6f05af9928b4cd7c0fb5080fa32304"} Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.788682 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b53858208324be620865499604c1b9b3d6f05af9928b4cd7c0fb5080fa32304" Mar 19 19:58:04 crc kubenswrapper[5033]: I0319 19:58:04.788739 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-7fqvw" Mar 19 19:58:05 crc kubenswrapper[5033]: I0319 19:58:05.401409 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-wncm9"] Mar 19 19:58:05 crc kubenswrapper[5033]: I0319 19:58:05.412268 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-wncm9"] Mar 19 19:58:06 crc kubenswrapper[5033]: I0319 19:58:06.631828 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb7a216-801f-4c2f-ad6c-1aa009ac10b1" path="/var/lib/kubelet/pods/9eb7a216-801f-4c2f-ad6c-1aa009ac10b1/volumes" Mar 19 19:58:10 crc kubenswrapper[5033]: I0319 19:58:10.758232 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:58:10 crc kubenswrapper[5033]: I0319 19:58:10.758759 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:58:10 crc kubenswrapper[5033]: I0319 19:58:10.758802 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 19:58:10 crc kubenswrapper[5033]: I0319 19:58:10.759776 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:58:10 crc kubenswrapper[5033]: I0319 19:58:10.759837 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841" gracePeriod=600 Mar 19 19:58:11 crc kubenswrapper[5033]: I0319 19:58:11.859241 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841" exitCode=0 Mar 19 19:58:11 crc kubenswrapper[5033]: I0319 19:58:11.859538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841"} Mar 19 19:58:11 crc kubenswrapper[5033]: I0319 19:58:11.859814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1"} Mar 19 19:58:11 crc kubenswrapper[5033]: I0319 19:58:11.859838 5033 scope.go:117] "RemoveContainer" containerID="7a88bb75b06dd8e5013f93648f791b92e7144d613860d6b0b25d85c51fd0acf1" Mar 19 19:58:18 crc kubenswrapper[5033]: I0319 19:58:18.212577 5033 scope.go:117] "RemoveContainer" containerID="1e3e0ace502610718aab4add7fcc50bdbc2bda1d3d4b29d95816b7d7659aff82" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.158124 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt"] Mar 19 20:00:00 crc 
kubenswrapper[5033]: E0319 20:00:00.159036 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d57273-84e1-4d47-a881-11916cf47217" containerName="oc" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.159049 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d57273-84e1-4d47-a881-11916cf47217" containerName="oc" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.159257 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d57273-84e1-4d47-a881-11916cf47217" containerName="oc" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.160042 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.162140 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.162373 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.168835 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565840-8cxkh"] Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.171442 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.173943 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.174318 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.174998 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.179134 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt"] Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.188103 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-8cxkh"] Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.293266 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvp2h\" (UniqueName: \"kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h\") pod \"auto-csr-approver-29565840-8cxkh\" (UID: \"28f37a5b-e76e-4a89-90f7-33796c755dc5\") " pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.293408 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.293569 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lrbs6\" (UniqueName: \"kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.293609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.395651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvp2h\" (UniqueName: \"kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h\") pod \"auto-csr-approver-29565840-8cxkh\" (UID: \"28f37a5b-e76e-4a89-90f7-33796c755dc5\") " pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.395727 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.395801 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbs6\" (UniqueName: \"kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.395823 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.397001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.402835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume\") pod \"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.415405 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvp2h\" (UniqueName: \"kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h\") pod \"auto-csr-approver-29565840-8cxkh\" (UID: \"28f37a5b-e76e-4a89-90f7-33796c755dc5\") " pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.416747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbs6\" (UniqueName: \"kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6\") pod 
\"collect-profiles-29565840-29xtt\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.485757 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.495939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:00 crc kubenswrapper[5033]: I0319 20:00:00.987822 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt"] Mar 19 20:00:00 crc kubenswrapper[5033]: W0319 20:00:00.989728 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da23f6d_d8c0_41c1_8eff_4ddeb4f1b2ad.slice/crio-bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b WatchSource:0}: Error finding container bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b: Status 404 returned error can't find the container with id bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.096073 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-8cxkh"] Mar 19 20:00:01 crc kubenswrapper[5033]: W0319 20:00:01.102515 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f37a5b_e76e_4a89_90f7_33796c755dc5.slice/crio-05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42 WatchSource:0}: Error finding container 05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42: Status 404 returned error can't find the container with id 
05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42 Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.107390 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.179772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" event={"ID":"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad","Type":"ContainerStarted","Data":"4263d185eb78e1c9cba1b2fe69625cc49d8ec4ebd8f2b51ca7c07a02412d6a9d"} Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.180852 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" event={"ID":"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad","Type":"ContainerStarted","Data":"bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b"} Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.181125 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" event={"ID":"28f37a5b-e76e-4a89-90f7-33796c755dc5","Type":"ContainerStarted","Data":"05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42"} Mar 19 20:00:01 crc kubenswrapper[5033]: I0319 20:00:01.201939 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" podStartSLOduration=1.201918682 podStartE2EDuration="1.201918682s" podCreationTimestamp="2026-03-19 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:00:01.194616776 +0000 UTC m=+3811.299646625" watchObservedRunningTime="2026-03-19 20:00:01.201918682 +0000 UTC m=+3811.306948531" Mar 19 20:00:02 crc kubenswrapper[5033]: I0319 20:00:02.196098 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" containerID="4263d185eb78e1c9cba1b2fe69625cc49d8ec4ebd8f2b51ca7c07a02412d6a9d" exitCode=0 Mar 19 20:00:02 crc kubenswrapper[5033]: I0319 20:00:02.196526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" event={"ID":"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad","Type":"ContainerDied","Data":"4263d185eb78e1c9cba1b2fe69625cc49d8ec4ebd8f2b51ca7c07a02412d6a9d"} Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.605136 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.770407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume\") pod \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.770597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume\") pod \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.770649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrbs6\" (UniqueName: \"kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6\") pod \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\" (UID: \"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad\") " Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.771665 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" (UID: "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.777322 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" (UID: "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.778680 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6" (OuterVolumeSpecName: "kube-api-access-lrbs6") pod "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" (UID: "2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad"). InnerVolumeSpecName "kube-api-access-lrbs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.873039 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrbs6\" (UniqueName: \"kubernetes.io/projected/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-kube-api-access-lrbs6\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.873080 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:03 crc kubenswrapper[5033]: I0319 20:00:03.873091 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.220821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" event={"ID":"2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad","Type":"ContainerDied","Data":"bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b"} Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.221138 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce638bb59c855815c3f7cd7c4f98b697989f65ebc2a9c4b12bcbd201eb08a5b" Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.221252 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-29xtt" Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.284895 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf"] Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.293620 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-78mtf"] Mar 19 20:00:04 crc kubenswrapper[5033]: I0319 20:00:04.636431 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962b3cc1-defe-429e-a24c-68c3bec382fe" path="/var/lib/kubelet/pods/962b3cc1-defe-429e-a24c-68c3bec382fe/volumes" Mar 19 20:00:12 crc kubenswrapper[5033]: I0319 20:00:12.319231 5033 generic.go:334] "Generic (PLEG): container finished" podID="28f37a5b-e76e-4a89-90f7-33796c755dc5" containerID="29b44fc8ea46a35a85cf617b715fc46ec4de743c78cdecefd553a69642fd7c1c" exitCode=0 Mar 19 20:00:12 crc kubenswrapper[5033]: I0319 20:00:12.319305 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" event={"ID":"28f37a5b-e76e-4a89-90f7-33796c755dc5","Type":"ContainerDied","Data":"29b44fc8ea46a35a85cf617b715fc46ec4de743c78cdecefd553a69642fd7c1c"} Mar 19 20:00:13 crc kubenswrapper[5033]: I0319 20:00:13.687220 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:13 crc kubenswrapper[5033]: I0319 20:00:13.790102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvp2h\" (UniqueName: \"kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h\") pod \"28f37a5b-e76e-4a89-90f7-33796c755dc5\" (UID: \"28f37a5b-e76e-4a89-90f7-33796c755dc5\") " Mar 19 20:00:13 crc kubenswrapper[5033]: I0319 20:00:13.796079 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h" (OuterVolumeSpecName: "kube-api-access-qvp2h") pod "28f37a5b-e76e-4a89-90f7-33796c755dc5" (UID: "28f37a5b-e76e-4a89-90f7-33796c755dc5"). InnerVolumeSpecName "kube-api-access-qvp2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:00:13 crc kubenswrapper[5033]: I0319 20:00:13.892818 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvp2h\" (UniqueName: \"kubernetes.io/projected/28f37a5b-e76e-4a89-90f7-33796c755dc5-kube-api-access-qvp2h\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:14 crc kubenswrapper[5033]: I0319 20:00:14.343016 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" event={"ID":"28f37a5b-e76e-4a89-90f7-33796c755dc5","Type":"ContainerDied","Data":"05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42"} Mar 19 20:00:14 crc kubenswrapper[5033]: I0319 20:00:14.343059 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05dce9b2274f1fab0236e3afaf0748eec0b248aff2ff66e377c8ef3f5353ff42" Mar 19 20:00:14 crc kubenswrapper[5033]: I0319 20:00:14.343071 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-8cxkh" Mar 19 20:00:14 crc kubenswrapper[5033]: I0319 20:00:14.746002 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-rdp8m"] Mar 19 20:00:14 crc kubenswrapper[5033]: I0319 20:00:14.763034 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-rdp8m"] Mar 19 20:00:16 crc kubenswrapper[5033]: I0319 20:00:16.629873 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245ffa8e-4d63-4f15-be05-f0f3fd6cecbe" path="/var/lib/kubelet/pods/245ffa8e-4d63-4f15-be05-f0f3fd6cecbe/volumes" Mar 19 20:00:18 crc kubenswrapper[5033]: I0319 20:00:18.335565 5033 scope.go:117] "RemoveContainer" containerID="3afcd07acd026610888aa77b0856e98a5304695c9c0118ef14986b1b3985c77b" Mar 19 20:00:18 crc kubenswrapper[5033]: I0319 20:00:18.814534 5033 scope.go:117] "RemoveContainer" containerID="f5c65fc239aa141290d587868c68e28894830eccf2b2950cd5d9df1831661065" Mar 19 20:00:40 crc kubenswrapper[5033]: I0319 20:00:40.758429 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:00:40 crc kubenswrapper[5033]: I0319 20:00:40.758974 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.152190 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565841-b6rqs"] Mar 19 20:01:00 crc kubenswrapper[5033]: E0319 
20:01:00.153126 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f37a5b-e76e-4a89-90f7-33796c755dc5" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.153141 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f37a5b-e76e-4a89-90f7-33796c755dc5" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[5033]: E0319 20:01:00.153178 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.153183 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.153390 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f37a5b-e76e-4a89-90f7-33796c755dc5" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.153404 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da23f6d-d8c0-41c1-8eff-4ddeb4f1b2ad" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.154157 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.165394 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565841-b6rqs"] Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.281693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.282006 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.282130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.282229 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvwm5\" (UniqueName: \"kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.383980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jvwm5\" (UniqueName: \"kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.384191 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.384237 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.384268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.392314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.392910 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.393325 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.410723 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvwm5\" (UniqueName: \"kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5\") pod \"keystone-cron-29565841-b6rqs\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:00 crc kubenswrapper[5033]: I0319 20:01:00.473483 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:01 crc kubenswrapper[5033]: I0319 20:01:01.099538 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565841-b6rqs"] Mar 19 20:01:01 crc kubenswrapper[5033]: W0319 20:01:01.110855 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5cc708_ac76_4f3a_baed_1d2791ce807a.slice/crio-15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96 WatchSource:0}: Error finding container 15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96: Status 404 returned error can't find the container with id 15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96 Mar 19 20:01:01 crc kubenswrapper[5033]: I0319 20:01:01.825324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-b6rqs" event={"ID":"0e5cc708-ac76-4f3a-baed-1d2791ce807a","Type":"ContainerStarted","Data":"0ab4767b958993b2b455d591f4cdceca96530b99fede1a92b8a3f86b4ad3f89e"} Mar 19 20:01:01 crc kubenswrapper[5033]: I0319 20:01:01.825775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-b6rqs" event={"ID":"0e5cc708-ac76-4f3a-baed-1d2791ce807a","Type":"ContainerStarted","Data":"15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96"} Mar 19 20:01:05 crc kubenswrapper[5033]: I0319 20:01:05.860138 5033 generic.go:334] "Generic (PLEG): container finished" podID="0e5cc708-ac76-4f3a-baed-1d2791ce807a" containerID="0ab4767b958993b2b455d591f4cdceca96530b99fede1a92b8a3f86b4ad3f89e" exitCode=0 Mar 19 20:01:05 crc kubenswrapper[5033]: I0319 20:01:05.860221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-b6rqs" event={"ID":"0e5cc708-ac76-4f3a-baed-1d2791ce807a","Type":"ContainerDied","Data":"0ab4767b958993b2b455d591f4cdceca96530b99fede1a92b8a3f86b4ad3f89e"} Mar 19 20:01:07 crc 
kubenswrapper[5033]: I0319 20:01:07.307073 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.437102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvwm5\" (UniqueName: \"kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5\") pod \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.437627 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys\") pod \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.437695 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data\") pod \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.437914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle\") pod \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\" (UID: \"0e5cc708-ac76-4f3a-baed-1d2791ce807a\") " Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.447054 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5" (OuterVolumeSpecName: "kube-api-access-jvwm5") pod "0e5cc708-ac76-4f3a-baed-1d2791ce807a" (UID: "0e5cc708-ac76-4f3a-baed-1d2791ce807a"). InnerVolumeSpecName "kube-api-access-jvwm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.447418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0e5cc708-ac76-4f3a-baed-1d2791ce807a" (UID: "0e5cc708-ac76-4f3a-baed-1d2791ce807a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.473044 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e5cc708-ac76-4f3a-baed-1d2791ce807a" (UID: "0e5cc708-ac76-4f3a-baed-1d2791ce807a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.518005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data" (OuterVolumeSpecName: "config-data") pod "0e5cc708-ac76-4f3a-baed-1d2791ce807a" (UID: "0e5cc708-ac76-4f3a-baed-1d2791ce807a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.540523 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.540552 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvwm5\" (UniqueName: \"kubernetes.io/projected/0e5cc708-ac76-4f3a-baed-1d2791ce807a-kube-api-access-jvwm5\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.540562 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.540570 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5cc708-ac76-4f3a-baed-1d2791ce807a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.878551 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-b6rqs" event={"ID":"0e5cc708-ac76-4f3a-baed-1d2791ce807a","Type":"ContainerDied","Data":"15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96"} Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.878804 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15d979a0f730ee1e893e39af46189573b782bf7e5103f56e693e3ca2e427cb96" Mar 19 20:01:07 crc kubenswrapper[5033]: I0319 20:01:07.878624 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565841-b6rqs" Mar 19 20:01:10 crc kubenswrapper[5033]: I0319 20:01:10.759206 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:01:10 crc kubenswrapper[5033]: I0319 20:01:10.759704 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.179319 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:20 crc kubenswrapper[5033]: E0319 20:01:20.180658 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5cc708-ac76-4f3a-baed-1d2791ce807a" containerName="keystone-cron" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.180677 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5cc708-ac76-4f3a-baed-1d2791ce807a" containerName="keystone-cron" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.180949 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5cc708-ac76-4f3a-baed-1d2791ce807a" containerName="keystone-cron" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.182853 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.203381 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.297255 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.297495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26k7\" (UniqueName: \"kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.297584 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.399164 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.399286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-k26k7\" (UniqueName: \"kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.399331 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.399662 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.399693 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.418247 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26k7\" (UniqueName: \"kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7\") pod \"redhat-operators-rzw92\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.504959 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:20 crc kubenswrapper[5033]: I0319 20:01:20.977543 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:21 crc kubenswrapper[5033]: I0319 20:01:21.013793 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerStarted","Data":"c69afe4204686741c31d85ad69d6b9aafaf756b0fcd2942c50ba8ad7879a5629"} Mar 19 20:01:22 crc kubenswrapper[5033]: I0319 20:01:22.025334 5033 generic.go:334] "Generic (PLEG): container finished" podID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerID="2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495" exitCode=0 Mar 19 20:01:22 crc kubenswrapper[5033]: I0319 20:01:22.025486 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerDied","Data":"2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495"} Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.769762 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.774307 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.794947 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.875709 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.875969 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.876021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps885\" (UniqueName: \"kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.977839 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.977909 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ps885\" (UniqueName: \"kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.977998 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.978526 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.978529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:23 crc kubenswrapper[5033]: I0319 20:01:23.999046 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps885\" (UniqueName: \"kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885\") pod \"community-operators-4pp58\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:24 crc kubenswrapper[5033]: I0319 20:01:24.044343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerStarted","Data":"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125"} Mar 19 20:01:24 crc kubenswrapper[5033]: I0319 20:01:24.099311 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:24 crc kubenswrapper[5033]: I0319 20:01:24.648440 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:25 crc kubenswrapper[5033]: I0319 20:01:25.055260 5033 generic.go:334] "Generic (PLEG): container finished" podID="1cd93123-cc97-4e82-9607-08107f267773" containerID="80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad" exitCode=0 Mar 19 20:01:25 crc kubenswrapper[5033]: I0319 20:01:25.055340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerDied","Data":"80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad"} Mar 19 20:01:25 crc kubenswrapper[5033]: I0319 20:01:25.055650 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerStarted","Data":"18804eb0f376b8cc588b1f7d7bc0cc7b7802118f6f82895546747bb420eeaee0"} Mar 19 20:01:26 crc kubenswrapper[5033]: I0319 20:01:26.067078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerStarted","Data":"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8"} Mar 19 20:01:29 crc kubenswrapper[5033]: I0319 20:01:29.098656 5033 generic.go:334] "Generic (PLEG): container finished" podID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" 
containerID="a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125" exitCode=0 Mar 19 20:01:29 crc kubenswrapper[5033]: I0319 20:01:29.098749 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerDied","Data":"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125"} Mar 19 20:01:29 crc kubenswrapper[5033]: I0319 20:01:29.102394 5033 generic.go:334] "Generic (PLEG): container finished" podID="1cd93123-cc97-4e82-9607-08107f267773" containerID="6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8" exitCode=0 Mar 19 20:01:29 crc kubenswrapper[5033]: I0319 20:01:29.102441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerDied","Data":"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8"} Mar 19 20:01:31 crc kubenswrapper[5033]: I0319 20:01:31.123177 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerStarted","Data":"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c"} Mar 19 20:01:31 crc kubenswrapper[5033]: I0319 20:01:31.125596 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerStarted","Data":"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa"} Mar 19 20:01:31 crc kubenswrapper[5033]: I0319 20:01:31.145273 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pp58" podStartSLOduration=3.234321207 podStartE2EDuration="8.145257547s" podCreationTimestamp="2026-03-19 20:01:23 +0000 UTC" firstStartedPulling="2026-03-19 20:01:25.056981613 +0000 
UTC m=+3895.162011462" lastFinishedPulling="2026-03-19 20:01:29.967917953 +0000 UTC m=+3900.072947802" observedRunningTime="2026-03-19 20:01:31.140184973 +0000 UTC m=+3901.245214822" watchObservedRunningTime="2026-03-19 20:01:31.145257547 +0000 UTC m=+3901.250287396" Mar 19 20:01:31 crc kubenswrapper[5033]: I0319 20:01:31.157467 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzw92" podStartSLOduration=3.155635284 podStartE2EDuration="11.15742814s" podCreationTimestamp="2026-03-19 20:01:20 +0000 UTC" firstStartedPulling="2026-03-19 20:01:22.027896631 +0000 UTC m=+3892.132926480" lastFinishedPulling="2026-03-19 20:01:30.029689477 +0000 UTC m=+3900.134719336" observedRunningTime="2026-03-19 20:01:31.156438262 +0000 UTC m=+3901.261468111" watchObservedRunningTime="2026-03-19 20:01:31.15742814 +0000 UTC m=+3901.262457989" Mar 19 20:01:34 crc kubenswrapper[5033]: I0319 20:01:34.099892 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:34 crc kubenswrapper[5033]: I0319 20:01:34.100396 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:34 crc kubenswrapper[5033]: I0319 20:01:34.229428 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.506070 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.506883 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.568302 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.759023 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.759096 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.759140 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.759993 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:01:40 crc kubenswrapper[5033]: I0319 20:01:40.760059 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" gracePeriod=600 Mar 19 20:01:40 crc kubenswrapper[5033]: E0319 20:01:40.888648 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:01:41 crc kubenswrapper[5033]: I0319 20:01:41.221717 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" exitCode=0 Mar 19 20:01:41 crc kubenswrapper[5033]: I0319 20:01:41.221807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1"} Mar 19 20:01:41 crc kubenswrapper[5033]: I0319 20:01:41.221866 5033 scope.go:117] "RemoveContainer" containerID="fd5d3324e07348c90efc43ea200a4b90e59cf1260a8d3cc4bfdc68d90328c841" Mar 19 20:01:41 crc kubenswrapper[5033]: I0319 20:01:41.222671 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:01:41 crc kubenswrapper[5033]: E0319 20:01:41.223280 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:01:41 crc kubenswrapper[5033]: I0319 20:01:41.280224 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:41 crc 
kubenswrapper[5033]: I0319 20:01:41.349865 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.238300 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzw92" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="registry-server" containerID="cri-o://8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa" gracePeriod=2 Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.799909 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.893188 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content\") pod \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.893244 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities\") pod \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.893504 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k26k7\" (UniqueName: \"kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7\") pod \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\" (UID: \"88cc06c7-e6e3-470e-a3e4-310baf3b17df\") " Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.894092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities" 
(OuterVolumeSpecName: "utilities") pod "88cc06c7-e6e3-470e-a3e4-310baf3b17df" (UID: "88cc06c7-e6e3-470e-a3e4-310baf3b17df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.900395 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7" (OuterVolumeSpecName: "kube-api-access-k26k7") pod "88cc06c7-e6e3-470e-a3e4-310baf3b17df" (UID: "88cc06c7-e6e3-470e-a3e4-310baf3b17df"). InnerVolumeSpecName "kube-api-access-k26k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.996479 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k26k7\" (UniqueName: \"kubernetes.io/projected/88cc06c7-e6e3-470e-a3e4-310baf3b17df-kube-api-access-k26k7\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:43 crc kubenswrapper[5033]: I0319 20:01:43.996516 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.031764 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88cc06c7-e6e3-470e-a3e4-310baf3b17df" (UID: "88cc06c7-e6e3-470e-a3e4-310baf3b17df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.099097 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cc06c7-e6e3-470e-a3e4-310baf3b17df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.161072 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.217990 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251165 5033 generic.go:334] "Generic (PLEG): container finished" podID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerID="8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa" exitCode=0 Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251260 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzw92" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251269 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerDied","Data":"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa"} Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251353 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzw92" event={"ID":"88cc06c7-e6e3-470e-a3e4-310baf3b17df","Type":"ContainerDied","Data":"c69afe4204686741c31d85ad69d6b9aafaf756b0fcd2942c50ba8ad7879a5629"} Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251384 5033 scope.go:117] "RemoveContainer" containerID="8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.251393 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4pp58" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="registry-server" containerID="cri-o://7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c" gracePeriod=2 Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.280487 5033 scope.go:117] "RemoveContainer" containerID="a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.297130 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.306280 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzw92"] Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.310687 5033 scope.go:117] "RemoveContainer" containerID="2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495" Mar 19 20:01:44 crc 
kubenswrapper[5033]: I0319 20:01:44.468651 5033 scope.go:117] "RemoveContainer" containerID="8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa" Mar 19 20:01:44 crc kubenswrapper[5033]: E0319 20:01:44.470667 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa\": container with ID starting with 8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa not found: ID does not exist" containerID="8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.470703 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa"} err="failed to get container status \"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa\": rpc error: code = NotFound desc = could not find container \"8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa\": container with ID starting with 8bfe3a030fefa6503404df52ec582bf86692ab642f2f3fa8c4710cad107c1afa not found: ID does not exist" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.470726 5033 scope.go:117] "RemoveContainer" containerID="a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125" Mar 19 20:01:44 crc kubenswrapper[5033]: E0319 20:01:44.475070 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125\": container with ID starting with a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125 not found: ID does not exist" containerID="a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.475119 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125"} err="failed to get container status \"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125\": rpc error: code = NotFound desc = could not find container \"a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125\": container with ID starting with a1bbf621db7fb151d25359665496f16394d9019f497e61b7c688b6286b3ec125 not found: ID does not exist" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.475148 5033 scope.go:117] "RemoveContainer" containerID="2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495" Mar 19 20:01:44 crc kubenswrapper[5033]: E0319 20:01:44.475752 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495\": container with ID starting with 2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495 not found: ID does not exist" containerID="2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.475821 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495"} err="failed to get container status \"2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495\": rpc error: code = NotFound desc = could not find container \"2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495\": container with ID starting with 2a9ab0ce70d40e935845718d864a2e4e9efc2871e0a72c7ce200f945d1151495 not found: ID does not exist" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.632976 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" path="/var/lib/kubelet/pods/88cc06c7-e6e3-470e-a3e4-310baf3b17df/volumes" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 
20:01:44.762970 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.812460 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps885\" (UniqueName: \"kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885\") pod \"1cd93123-cc97-4e82-9607-08107f267773\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.812510 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content\") pod \"1cd93123-cc97-4e82-9607-08107f267773\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.812903 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities\") pod \"1cd93123-cc97-4e82-9607-08107f267773\" (UID: \"1cd93123-cc97-4e82-9607-08107f267773\") " Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.814822 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities" (OuterVolumeSpecName: "utilities") pod "1cd93123-cc97-4e82-9607-08107f267773" (UID: "1cd93123-cc97-4e82-9607-08107f267773"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.817312 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885" (OuterVolumeSpecName: "kube-api-access-ps885") pod "1cd93123-cc97-4e82-9607-08107f267773" (UID: "1cd93123-cc97-4e82-9607-08107f267773"). InnerVolumeSpecName "kube-api-access-ps885". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.914797 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.914836 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps885\" (UniqueName: \"kubernetes.io/projected/1cd93123-cc97-4e82-9607-08107f267773-kube-api-access-ps885\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:44 crc kubenswrapper[5033]: I0319 20:01:44.933709 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd93123-cc97-4e82-9607-08107f267773" (UID: "1cd93123-cc97-4e82-9607-08107f267773"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.016842 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd93123-cc97-4e82-9607-08107f267773-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.263281 5033 generic.go:334] "Generic (PLEG): container finished" podID="1cd93123-cc97-4e82-9607-08107f267773" containerID="7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c" exitCode=0 Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.263340 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pp58" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.263370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerDied","Data":"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c"} Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.263420 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pp58" event={"ID":"1cd93123-cc97-4e82-9607-08107f267773","Type":"ContainerDied","Data":"18804eb0f376b8cc588b1f7d7bc0cc7b7802118f6f82895546747bb420eeaee0"} Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.263441 5033 scope.go:117] "RemoveContainer" containerID="7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.283240 5033 scope.go:117] "RemoveContainer" containerID="6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.299063 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:45 crc kubenswrapper[5033]: 
I0319 20:01:45.307764 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4pp58"] Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.322942 5033 scope.go:117] "RemoveContainer" containerID="80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.342578 5033 scope.go:117] "RemoveContainer" containerID="7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c" Mar 19 20:01:45 crc kubenswrapper[5033]: E0319 20:01:45.343185 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c\": container with ID starting with 7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c not found: ID does not exist" containerID="7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.343246 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c"} err="failed to get container status \"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c\": rpc error: code = NotFound desc = could not find container \"7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c\": container with ID starting with 7772ce1015f692e3664a7361a73e7c259f585799af41f14de8a775ee218f609c not found: ID does not exist" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.343280 5033 scope.go:117] "RemoveContainer" containerID="6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8" Mar 19 20:01:45 crc kubenswrapper[5033]: E0319 20:01:45.343916 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8\": container 
with ID starting with 6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8 not found: ID does not exist" containerID="6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.343968 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8"} err="failed to get container status \"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8\": rpc error: code = NotFound desc = could not find container \"6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8\": container with ID starting with 6315351d32b06cc36653f3440231acd7709da54cc9e1e20feb1a82cfe384aab8 not found: ID does not exist" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.344002 5033 scope.go:117] "RemoveContainer" containerID="80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad" Mar 19 20:01:45 crc kubenswrapper[5033]: E0319 20:01:45.344334 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad\": container with ID starting with 80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad not found: ID does not exist" containerID="80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad" Mar 19 20:01:45 crc kubenswrapper[5033]: I0319 20:01:45.344385 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad"} err="failed to get container status \"80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad\": rpc error: code = NotFound desc = could not find container \"80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad\": container with ID starting with 80837273d901f55dadf456f44cad1f53ccb1bf1a229a416b0f286d5cc66d0cad not 
found: ID does not exist" Mar 19 20:01:46 crc kubenswrapper[5033]: I0319 20:01:46.637613 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd93123-cc97-4e82-9607-08107f267773" path="/var/lib/kubelet/pods/1cd93123-cc97-4e82-9607-08107f267773/volumes" Mar 19 20:01:56 crc kubenswrapper[5033]: I0319 20:01:56.620703 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:01:56 crc kubenswrapper[5033]: E0319 20:01:56.622784 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.141044 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565842-sb8sk"] Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142006 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142019 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142027 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142036 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142050 5033 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142057 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142083 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142088 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142099 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142105 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[5033]: E0319 20:02:00.142119 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142125 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142310 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd93123-cc97-4e82-9607-08107f267773" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.142325 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cc06c7-e6e3-470e-a3e4-310baf3b17df" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.143147 5033 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.146616 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.146684 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.146999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.163071 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-sb8sk"] Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.246697 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2sfp\" (UniqueName: \"kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp\") pod \"auto-csr-approver-29565842-sb8sk\" (UID: \"97c653c9-a189-4a7d-a694-4c53306c0c4e\") " pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.349659 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2sfp\" (UniqueName: \"kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp\") pod \"auto-csr-approver-29565842-sb8sk\" (UID: \"97c653c9-a189-4a7d-a694-4c53306c0c4e\") " pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.370121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2sfp\" (UniqueName: \"kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp\") pod \"auto-csr-approver-29565842-sb8sk\" (UID: 
\"97c653c9-a189-4a7d-a694-4c53306c0c4e\") " pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.462833 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:00 crc kubenswrapper[5033]: I0319 20:02:00.926601 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-sb8sk"] Mar 19 20:02:01 crc kubenswrapper[5033]: I0319 20:02:01.444211 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" event={"ID":"97c653c9-a189-4a7d-a694-4c53306c0c4e","Type":"ContainerStarted","Data":"21f344e29a3933a8aa9652d7f0bad7c78e74259f5b6ef187a3910608f26887b5"} Mar 19 20:02:02 crc kubenswrapper[5033]: I0319 20:02:02.454713 5033 generic.go:334] "Generic (PLEG): container finished" podID="97c653c9-a189-4a7d-a694-4c53306c0c4e" containerID="3c8d521f51d724190ba9d91ba168ba3b48c0de14adea2d4738d5b050e348ca13" exitCode=0 Mar 19 20:02:02 crc kubenswrapper[5033]: I0319 20:02:02.454771 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" event={"ID":"97c653c9-a189-4a7d-a694-4c53306c0c4e","Type":"ContainerDied","Data":"3c8d521f51d724190ba9d91ba168ba3b48c0de14adea2d4738d5b050e348ca13"} Mar 19 20:02:03 crc kubenswrapper[5033]: I0319 20:02:03.968752 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.132654 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2sfp\" (UniqueName: \"kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp\") pod \"97c653c9-a189-4a7d-a694-4c53306c0c4e\" (UID: \"97c653c9-a189-4a7d-a694-4c53306c0c4e\") " Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.139053 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp" (OuterVolumeSpecName: "kube-api-access-c2sfp") pod "97c653c9-a189-4a7d-a694-4c53306c0c4e" (UID: "97c653c9-a189-4a7d-a694-4c53306c0c4e"). InnerVolumeSpecName "kube-api-access-c2sfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.235537 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2sfp\" (UniqueName: \"kubernetes.io/projected/97c653c9-a189-4a7d-a694-4c53306c0c4e-kube-api-access-c2sfp\") on node \"crc\" DevicePath \"\"" Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.475530 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" event={"ID":"97c653c9-a189-4a7d-a694-4c53306c0c4e","Type":"ContainerDied","Data":"21f344e29a3933a8aa9652d7f0bad7c78e74259f5b6ef187a3910608f26887b5"} Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.475574 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f344e29a3933a8aa9652d7f0bad7c78e74259f5b6ef187a3910608f26887b5" Mar 19 20:02:04 crc kubenswrapper[5033]: I0319 20:02:04.475599 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-sb8sk" Mar 19 20:02:05 crc kubenswrapper[5033]: I0319 20:02:05.066588 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-xhwbf"] Mar 19 20:02:05 crc kubenswrapper[5033]: I0319 20:02:05.079846 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-xhwbf"] Mar 19 20:02:06 crc kubenswrapper[5033]: I0319 20:02:06.631959 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a7b950-c618-442c-922d-f0c6f91463cb" path="/var/lib/kubelet/pods/27a7b950-c618-442c-922d-f0c6f91463cb/volumes" Mar 19 20:02:10 crc kubenswrapper[5033]: I0319 20:02:10.626936 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:02:10 crc kubenswrapper[5033]: E0319 20:02:10.627645 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:02:19 crc kubenswrapper[5033]: I0319 20:02:19.045415 5033 scope.go:117] "RemoveContainer" containerID="d9bf45b322e01e033941d9c6618511fc87d7cc01b3d40a18a86a87889ed15d2f" Mar 19 20:02:23 crc kubenswrapper[5033]: I0319 20:02:23.620894 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:02:23 crc kubenswrapper[5033]: E0319 20:02:23.621602 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:02:38 crc kubenswrapper[5033]: I0319 20:02:38.620520 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:02:38 crc kubenswrapper[5033]: E0319 20:02:38.621268 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:02:52 crc kubenswrapper[5033]: I0319 20:02:52.620916 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:02:52 crc kubenswrapper[5033]: E0319 20:02:52.622246 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:03:04 crc kubenswrapper[5033]: I0319 20:03:04.621068 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:03:04 crc kubenswrapper[5033]: E0319 20:03:04.622077 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:03:19 crc kubenswrapper[5033]: I0319 20:03:19.621234 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:03:19 crc kubenswrapper[5033]: E0319 20:03:19.622251 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:03:32 crc kubenswrapper[5033]: I0319 20:03:32.621063 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:03:32 crc kubenswrapper[5033]: E0319 20:03:32.622183 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:03:47 crc kubenswrapper[5033]: I0319 20:03:47.620912 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:03:47 crc kubenswrapper[5033]: E0319 20:03:47.622065 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:03:58 crc kubenswrapper[5033]: I0319 20:03:58.621285 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:03:58 crc kubenswrapper[5033]: E0319 20:03:58.622384 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.153843 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565844-n2r2j"] Mar 19 20:04:00 crc kubenswrapper[5033]: E0319 20:04:00.154733 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c653c9-a189-4a7d-a694-4c53306c0c4e" containerName="oc" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.154752 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c653c9-a189-4a7d-a694-4c53306c0c4e" containerName="oc" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.155028 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c653c9-a189-4a7d-a694-4c53306c0c4e" containerName="oc" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.156051 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.162067 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.162291 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.162988 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.167748 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-n2r2j"] Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.248400 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rw55\" (UniqueName: \"kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55\") pod \"auto-csr-approver-29565844-n2r2j\" (UID: \"4ddc6918-5028-4cfe-bc1e-7e92def5be70\") " pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.351199 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rw55\" (UniqueName: \"kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55\") pod \"auto-csr-approver-29565844-n2r2j\" (UID: \"4ddc6918-5028-4cfe-bc1e-7e92def5be70\") " pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.374646 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rw55\" (UniqueName: \"kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55\") pod \"auto-csr-approver-29565844-n2r2j\" (UID: \"4ddc6918-5028-4cfe-bc1e-7e92def5be70\") " 
pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.477121 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:00 crc kubenswrapper[5033]: I0319 20:04:00.945490 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-n2r2j"] Mar 19 20:04:01 crc kubenswrapper[5033]: I0319 20:04:01.241639 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" event={"ID":"4ddc6918-5028-4cfe-bc1e-7e92def5be70","Type":"ContainerStarted","Data":"c7629d74b317c36d211c4391de412a33cc9d63b9ed4ce24f9d4d2227386b3303"} Mar 19 20:04:02 crc kubenswrapper[5033]: I0319 20:04:02.252530 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" event={"ID":"4ddc6918-5028-4cfe-bc1e-7e92def5be70","Type":"ContainerStarted","Data":"6880cfaf3c72c45e4f1d52f107d3f1b9980ec06d019880299ff072455731d042"} Mar 19 20:04:02 crc kubenswrapper[5033]: I0319 20:04:02.273783 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" podStartSLOduration=1.331505821 podStartE2EDuration="2.273763352s" podCreationTimestamp="2026-03-19 20:04:00 +0000 UTC" firstStartedPulling="2026-03-19 20:04:00.942866974 +0000 UTC m=+4051.047896823" lastFinishedPulling="2026-03-19 20:04:01.885124495 +0000 UTC m=+4051.990154354" observedRunningTime="2026-03-19 20:04:02.265107288 +0000 UTC m=+4052.370137177" watchObservedRunningTime="2026-03-19 20:04:02.273763352 +0000 UTC m=+4052.378793201" Mar 19 20:04:03 crc kubenswrapper[5033]: I0319 20:04:03.265508 5033 generic.go:334] "Generic (PLEG): container finished" podID="4ddc6918-5028-4cfe-bc1e-7e92def5be70" containerID="6880cfaf3c72c45e4f1d52f107d3f1b9980ec06d019880299ff072455731d042" exitCode=0 Mar 19 20:04:03 crc 
kubenswrapper[5033]: I0319 20:04:03.265619 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" event={"ID":"4ddc6918-5028-4cfe-bc1e-7e92def5be70","Type":"ContainerDied","Data":"6880cfaf3c72c45e4f1d52f107d3f1b9980ec06d019880299ff072455731d042"} Mar 19 20:04:04 crc kubenswrapper[5033]: I0319 20:04:04.790057 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:04 crc kubenswrapper[5033]: I0319 20:04:04.846943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rw55\" (UniqueName: \"kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55\") pod \"4ddc6918-5028-4cfe-bc1e-7e92def5be70\" (UID: \"4ddc6918-5028-4cfe-bc1e-7e92def5be70\") " Mar 19 20:04:04 crc kubenswrapper[5033]: I0319 20:04:04.854824 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55" (OuterVolumeSpecName: "kube-api-access-9rw55") pod "4ddc6918-5028-4cfe-bc1e-7e92def5be70" (UID: "4ddc6918-5028-4cfe-bc1e-7e92def5be70"). InnerVolumeSpecName "kube-api-access-9rw55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:04:04 crc kubenswrapper[5033]: I0319 20:04:04.950198 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rw55\" (UniqueName: \"kubernetes.io/projected/4ddc6918-5028-4cfe-bc1e-7e92def5be70-kube-api-access-9rw55\") on node \"crc\" DevicePath \"\"" Mar 19 20:04:05 crc kubenswrapper[5033]: I0319 20:04:05.300985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" event={"ID":"4ddc6918-5028-4cfe-bc1e-7e92def5be70","Type":"ContainerDied","Data":"c7629d74b317c36d211c4391de412a33cc9d63b9ed4ce24f9d4d2227386b3303"} Mar 19 20:04:05 crc kubenswrapper[5033]: I0319 20:04:05.301039 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7629d74b317c36d211c4391de412a33cc9d63b9ed4ce24f9d4d2227386b3303" Mar 19 20:04:05 crc kubenswrapper[5033]: I0319 20:04:05.301589 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-n2r2j" Mar 19 20:04:05 crc kubenswrapper[5033]: I0319 20:04:05.350350 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-7fqvw"] Mar 19 20:04:05 crc kubenswrapper[5033]: I0319 20:04:05.358580 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-7fqvw"] Mar 19 20:04:06 crc kubenswrapper[5033]: I0319 20:04:06.634011 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d57273-84e1-4d47-a881-11916cf47217" path="/var/lib/kubelet/pods/09d57273-84e1-4d47-a881-11916cf47217/volumes" Mar 19 20:04:11 crc kubenswrapper[5033]: I0319 20:04:11.620640 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:04:11 crc kubenswrapper[5033]: E0319 20:04:11.622890 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:04:19 crc kubenswrapper[5033]: I0319 20:04:19.215869 5033 scope.go:117] "RemoveContainer" containerID="11c67242ce89603f04863dcf7fc3e4da85746d9f8128b352d583e05336ab73d7" Mar 19 20:04:23 crc kubenswrapper[5033]: I0319 20:04:23.621341 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:04:23 crc kubenswrapper[5033]: E0319 20:04:23.626032 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:04:35 crc kubenswrapper[5033]: I0319 20:04:35.621741 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:04:35 crc kubenswrapper[5033]: E0319 20:04:35.622741 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:04:50 crc kubenswrapper[5033]: I0319 20:04:50.630083 5033 scope.go:117] "RemoveContainer" 
containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:04:50 crc kubenswrapper[5033]: E0319 20:04:50.630909 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:05:05 crc kubenswrapper[5033]: I0319 20:05:05.620268 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:05:05 crc kubenswrapper[5033]: E0319 20:05:05.621249 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:05:19 crc kubenswrapper[5033]: I0319 20:05:19.620811 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:05:19 crc kubenswrapper[5033]: E0319 20:05:19.621926 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:05:32 crc kubenswrapper[5033]: I0319 20:05:32.620345 5033 scope.go:117] 
"RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:05:32 crc kubenswrapper[5033]: E0319 20:05:32.621032 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:05:47 crc kubenswrapper[5033]: I0319 20:05:47.621746 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:05:47 crc kubenswrapper[5033]: E0319 20:05:47.622964 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.173314 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565846-ks689"] Mar 19 20:06:00 crc kubenswrapper[5033]: E0319 20:06:00.175188 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddc6918-5028-4cfe-bc1e-7e92def5be70" containerName="oc" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.175217 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddc6918-5028-4cfe-bc1e-7e92def5be70" containerName="oc" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.175799 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddc6918-5028-4cfe-bc1e-7e92def5be70" containerName="oc" Mar 
19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.177527 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.180446 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.180830 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.181241 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.204641 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-ks689"] Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.236491 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6gm\" (UniqueName: \"kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm\") pod \"auto-csr-approver-29565846-ks689\" (UID: \"6483248d-cb85-48f6-a317-d34866702c5c\") " pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.339498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6gm\" (UniqueName: \"kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm\") pod \"auto-csr-approver-29565846-ks689\" (UID: \"6483248d-cb85-48f6-a317-d34866702c5c\") " pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.364231 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6gm\" (UniqueName: 
\"kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm\") pod \"auto-csr-approver-29565846-ks689\" (UID: \"6483248d-cb85-48f6-a317-d34866702c5c\") " pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:00 crc kubenswrapper[5033]: I0319 20:06:00.503963 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:01 crc kubenswrapper[5033]: I0319 20:06:01.023724 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-ks689"] Mar 19 20:06:01 crc kubenswrapper[5033]: I0319 20:06:01.596398 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:06:01 crc kubenswrapper[5033]: I0319 20:06:01.621862 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:06:01 crc kubenswrapper[5033]: E0319 20:06:01.622546 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:06:02 crc kubenswrapper[5033]: I0319 20:06:02.517260 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-ks689" event={"ID":"6483248d-cb85-48f6-a317-d34866702c5c","Type":"ContainerStarted","Data":"ee859b78a958d3cd2543b7a5aa82eaefb03a0560cfc58a50bb74be6cba7fcf9f"} Mar 19 20:06:03 crc kubenswrapper[5033]: I0319 20:06:03.533834 5033 generic.go:334] "Generic (PLEG): container finished" podID="6483248d-cb85-48f6-a317-d34866702c5c" 
containerID="3b6c80295b7e10f3feeea18a336dd21df48982db2cc067fc70aeedecc11d38ae" exitCode=0 Mar 19 20:06:03 crc kubenswrapper[5033]: I0319 20:06:03.534003 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-ks689" event={"ID":"6483248d-cb85-48f6-a317-d34866702c5c","Type":"ContainerDied","Data":"3b6c80295b7e10f3feeea18a336dd21df48982db2cc067fc70aeedecc11d38ae"} Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.080114 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.260955 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6gm\" (UniqueName: \"kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm\") pod \"6483248d-cb85-48f6-a317-d34866702c5c\" (UID: \"6483248d-cb85-48f6-a317-d34866702c5c\") " Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.269003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm" (OuterVolumeSpecName: "kube-api-access-xg6gm") pod "6483248d-cb85-48f6-a317-d34866702c5c" (UID: "6483248d-cb85-48f6-a317-d34866702c5c"). InnerVolumeSpecName "kube-api-access-xg6gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.364227 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6gm\" (UniqueName: \"kubernetes.io/projected/6483248d-cb85-48f6-a317-d34866702c5c-kube-api-access-xg6gm\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.561536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-ks689" event={"ID":"6483248d-cb85-48f6-a317-d34866702c5c","Type":"ContainerDied","Data":"ee859b78a958d3cd2543b7a5aa82eaefb03a0560cfc58a50bb74be6cba7fcf9f"} Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.561588 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee859b78a958d3cd2543b7a5aa82eaefb03a0560cfc58a50bb74be6cba7fcf9f" Mar 19 20:06:05 crc kubenswrapper[5033]: I0319 20:06:05.561627 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-ks689" Mar 19 20:06:06 crc kubenswrapper[5033]: I0319 20:06:06.173628 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-8cxkh"] Mar 19 20:06:06 crc kubenswrapper[5033]: I0319 20:06:06.183662 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-8cxkh"] Mar 19 20:06:06 crc kubenswrapper[5033]: I0319 20:06:06.648047 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f37a5b-e76e-4a89-90f7-33796c755dc5" path="/var/lib/kubelet/pods/28f37a5b-e76e-4a89-90f7-33796c755dc5/volumes" Mar 19 20:06:07 crc kubenswrapper[5033]: I0319 20:06:07.987807 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:07 crc kubenswrapper[5033]: E0319 20:06:07.989812 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6483248d-cb85-48f6-a317-d34866702c5c" containerName="oc" Mar 19 20:06:07 crc kubenswrapper[5033]: I0319 20:06:07.989925 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6483248d-cb85-48f6-a317-d34866702c5c" containerName="oc" Mar 19 20:06:07 crc kubenswrapper[5033]: I0319 20:06:07.990313 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6483248d-cb85-48f6-a317-d34866702c5c" containerName="oc" Mar 19 20:06:07 crc kubenswrapper[5033]: I0319 20:06:07.992383 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.014593 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.137380 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.137424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49kk\" (UniqueName: \"kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.137491 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " 
pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.239219 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.239425 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.239457 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49kk\" (UniqueName: \"kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.240108 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.240170 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" 
Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.263070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49kk\" (UniqueName: \"kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk\") pod \"redhat-marketplace-bnbn5\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.327685 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:08 crc kubenswrapper[5033]: I0319 20:06:08.826232 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:09 crc kubenswrapper[5033]: I0319 20:06:09.615954 5033 generic.go:334] "Generic (PLEG): container finished" podID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerID="5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b" exitCode=0 Mar 19 20:06:09 crc kubenswrapper[5033]: I0319 20:06:09.616039 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerDied","Data":"5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b"} Mar 19 20:06:09 crc kubenswrapper[5033]: I0319 20:06:09.616354 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerStarted","Data":"ece68475da020f623e5461fe7e86b205f5af6dcf622d61941f1cc51c4ab09fb2"} Mar 19 20:06:10 crc kubenswrapper[5033]: I0319 20:06:10.634333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerStarted","Data":"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6"} Mar 19 
20:06:11 crc kubenswrapper[5033]: I0319 20:06:11.654861 5033 generic.go:334] "Generic (PLEG): container finished" podID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerID="5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6" exitCode=0 Mar 19 20:06:11 crc kubenswrapper[5033]: I0319 20:06:11.654948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerDied","Data":"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6"} Mar 19 20:06:12 crc kubenswrapper[5033]: I0319 20:06:12.622370 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:06:12 crc kubenswrapper[5033]: E0319 20:06:12.623623 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:06:12 crc kubenswrapper[5033]: I0319 20:06:12.671229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerStarted","Data":"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e"} Mar 19 20:06:12 crc kubenswrapper[5033]: I0319 20:06:12.706905 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnbn5" podStartSLOduration=3.261469087 podStartE2EDuration="5.706878356s" podCreationTimestamp="2026-03-19 20:06:07 +0000 UTC" firstStartedPulling="2026-03-19 20:06:09.618713449 +0000 UTC m=+4179.723743318" lastFinishedPulling="2026-03-19 
20:06:12.064122718 +0000 UTC m=+4182.169152587" observedRunningTime="2026-03-19 20:06:12.694238699 +0000 UTC m=+4182.799268588" watchObservedRunningTime="2026-03-19 20:06:12.706878356 +0000 UTC m=+4182.811908215" Mar 19 20:06:18 crc kubenswrapper[5033]: I0319 20:06:18.327912 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:18 crc kubenswrapper[5033]: I0319 20:06:18.328422 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:18 crc kubenswrapper[5033]: I0319 20:06:18.654039 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:18 crc kubenswrapper[5033]: I0319 20:06:18.772426 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:18 crc kubenswrapper[5033]: I0319 20:06:18.888321 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:19 crc kubenswrapper[5033]: I0319 20:06:19.315378 5033 scope.go:117] "RemoveContainer" containerID="29b44fc8ea46a35a85cf617b715fc46ec4de743c78cdecefd553a69642fd7c1c" Mar 19 20:06:20 crc kubenswrapper[5033]: I0319 20:06:20.747554 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnbn5" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="registry-server" containerID="cri-o://99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e" gracePeriod=2 Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.338572 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.454347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities\") pod \"02e73b86-ca0f-44d7-96a6-934943bfa74c\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.454853 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content\") pod \"02e73b86-ca0f-44d7-96a6-934943bfa74c\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.455064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49kk\" (UniqueName: \"kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk\") pod \"02e73b86-ca0f-44d7-96a6-934943bfa74c\" (UID: \"02e73b86-ca0f-44d7-96a6-934943bfa74c\") " Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.456064 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities" (OuterVolumeSpecName: "utilities") pod "02e73b86-ca0f-44d7-96a6-934943bfa74c" (UID: "02e73b86-ca0f-44d7-96a6-934943bfa74c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.463154 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk" (OuterVolumeSpecName: "kube-api-access-j49kk") pod "02e73b86-ca0f-44d7-96a6-934943bfa74c" (UID: "02e73b86-ca0f-44d7-96a6-934943bfa74c"). InnerVolumeSpecName "kube-api-access-j49kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.489866 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02e73b86-ca0f-44d7-96a6-934943bfa74c" (UID: "02e73b86-ca0f-44d7-96a6-934943bfa74c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.559806 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49kk\" (UniqueName: \"kubernetes.io/projected/02e73b86-ca0f-44d7-96a6-934943bfa74c-kube-api-access-j49kk\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.559884 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.559907 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02e73b86-ca0f-44d7-96a6-934943bfa74c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.761736 5033 generic.go:334] "Generic (PLEG): container finished" podID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerID="99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e" exitCode=0 Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.761793 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerDied","Data":"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e"} Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.761803 5033 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbn5" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.761836 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbn5" event={"ID":"02e73b86-ca0f-44d7-96a6-934943bfa74c","Type":"ContainerDied","Data":"ece68475da020f623e5461fe7e86b205f5af6dcf622d61941f1cc51c4ab09fb2"} Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.761856 5033 scope.go:117] "RemoveContainer" containerID="99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.784083 5033 scope.go:117] "RemoveContainer" containerID="5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.799518 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.809695 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbn5"] Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.823316 5033 scope.go:117] "RemoveContainer" containerID="5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.852615 5033 scope.go:117] "RemoveContainer" containerID="99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e" Mar 19 20:06:21 crc kubenswrapper[5033]: E0319 20:06:21.853378 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e\": container with ID starting with 99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e not found: ID does not exist" containerID="99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.853422 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e"} err="failed to get container status \"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e\": rpc error: code = NotFound desc = could not find container \"99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e\": container with ID starting with 99fc7624ffeb2179a2f5746b310565e449d25daaae291e6d2f807abc2f036e0e not found: ID does not exist" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.853469 5033 scope.go:117] "RemoveContainer" containerID="5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6" Mar 19 20:06:21 crc kubenswrapper[5033]: E0319 20:06:21.853934 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6\": container with ID starting with 5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6 not found: ID does not exist" containerID="5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.854003 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6"} err="failed to get container status \"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6\": rpc error: code = NotFound desc = could not find container \"5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6\": container with ID starting with 5d69e88be724b08025b1b07b3016d997711e15eceb2ec79a3d95e8f7e3e558e6 not found: ID does not exist" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.854050 5033 scope.go:117] "RemoveContainer" containerID="5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b" Mar 19 20:06:21 crc kubenswrapper[5033]: E0319 
20:06:21.854417 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b\": container with ID starting with 5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b not found: ID does not exist" containerID="5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b" Mar 19 20:06:21 crc kubenswrapper[5033]: I0319 20:06:21.854472 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b"} err="failed to get container status \"5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b\": rpc error: code = NotFound desc = could not find container \"5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b\": container with ID starting with 5c718948d4738370878617bfda816d636d251ac5193109ff5ed205fe943f0e9b not found: ID does not exist" Mar 19 20:06:22 crc kubenswrapper[5033]: I0319 20:06:22.638148 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" path="/var/lib/kubelet/pods/02e73b86-ca0f-44d7-96a6-934943bfa74c/volumes" Mar 19 20:06:24 crc kubenswrapper[5033]: I0319 20:06:24.621566 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:06:24 crc kubenswrapper[5033]: E0319 20:06:24.622404 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.396166 
5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:30 crc kubenswrapper[5033]: E0319 20:06:30.397281 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="extract-content" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.397293 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="extract-content" Mar 19 20:06:30 crc kubenswrapper[5033]: E0319 20:06:30.397320 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="registry-server" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.397327 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="registry-server" Mar 19 20:06:30 crc kubenswrapper[5033]: E0319 20:06:30.397360 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="extract-utilities" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.397367 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="extract-utilities" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.397591 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e73b86-ca0f-44d7-96a6-934943bfa74c" containerName="registry-server" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.399310 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.415030 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.492895 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wwt\" (UniqueName: \"kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.492999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.493037 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.594523 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.594581 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.594693 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wwt\" (UniqueName: \"kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.595099 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.595134 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:30 crc kubenswrapper[5033]: I0319 20:06:30.791340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wwt\" (UniqueName: \"kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt\") pod \"certified-operators-2fvnm\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:31 crc kubenswrapper[5033]: I0319 20:06:31.021776 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:31 crc kubenswrapper[5033]: I0319 20:06:31.562878 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:31 crc kubenswrapper[5033]: I0319 20:06:31.872397 5033 generic.go:334] "Generic (PLEG): container finished" podID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerID="8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc" exitCode=0 Mar 19 20:06:31 crc kubenswrapper[5033]: I0319 20:06:31.872561 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerDied","Data":"8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc"} Mar 19 20:06:31 crc kubenswrapper[5033]: I0319 20:06:31.873092 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerStarted","Data":"423d500a472e7032b383353de218818e0cb435ac8703635c92f0a9ee812b2ff0"} Mar 19 20:06:32 crc kubenswrapper[5033]: I0319 20:06:32.898916 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerStarted","Data":"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf"} Mar 19 20:06:33 crc kubenswrapper[5033]: I0319 20:06:33.913934 5033 generic.go:334] "Generic (PLEG): container finished" podID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerID="bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf" exitCode=0 Mar 19 20:06:33 crc kubenswrapper[5033]: I0319 20:06:33.914096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" 
event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerDied","Data":"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf"} Mar 19 20:06:34 crc kubenswrapper[5033]: I0319 20:06:34.928378 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerStarted","Data":"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081"} Mar 19 20:06:34 crc kubenswrapper[5033]: I0319 20:06:34.953550 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2fvnm" podStartSLOduration=2.461912118 podStartE2EDuration="4.953437188s" podCreationTimestamp="2026-03-19 20:06:30 +0000 UTC" firstStartedPulling="2026-03-19 20:06:31.874404838 +0000 UTC m=+4201.979434687" lastFinishedPulling="2026-03-19 20:06:34.365929908 +0000 UTC m=+4204.470959757" observedRunningTime="2026-03-19 20:06:34.946883153 +0000 UTC m=+4205.051913022" watchObservedRunningTime="2026-03-19 20:06:34.953437188 +0000 UTC m=+4205.058467077" Mar 19 20:06:39 crc kubenswrapper[5033]: I0319 20:06:39.621812 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:06:39 crc kubenswrapper[5033]: E0319 20:06:39.623227 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:06:41 crc kubenswrapper[5033]: I0319 20:06:41.022167 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:41 crc 
kubenswrapper[5033]: I0319 20:06:41.023104 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:41 crc kubenswrapper[5033]: I0319 20:06:41.078834 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:42 crc kubenswrapper[5033]: I0319 20:06:42.052825 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:42 crc kubenswrapper[5033]: I0319 20:06:42.129574 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.020702 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2fvnm" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="registry-server" containerID="cri-o://fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081" gracePeriod=2 Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.543723 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.719577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities\") pod \"853870d0-0afc-4f70-908b-c07c78d3fa8a\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.720299 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5wwt\" (UniqueName: \"kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt\") pod \"853870d0-0afc-4f70-908b-c07c78d3fa8a\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.720512 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content\") pod \"853870d0-0afc-4f70-908b-c07c78d3fa8a\" (UID: \"853870d0-0afc-4f70-908b-c07c78d3fa8a\") " Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.720819 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities" (OuterVolumeSpecName: "utilities") pod "853870d0-0afc-4f70-908b-c07c78d3fa8a" (UID: "853870d0-0afc-4f70-908b-c07c78d3fa8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.721375 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.738913 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt" (OuterVolumeSpecName: "kube-api-access-j5wwt") pod "853870d0-0afc-4f70-908b-c07c78d3fa8a" (UID: "853870d0-0afc-4f70-908b-c07c78d3fa8a"). InnerVolumeSpecName "kube-api-access-j5wwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.786977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "853870d0-0afc-4f70-908b-c07c78d3fa8a" (UID: "853870d0-0afc-4f70-908b-c07c78d3fa8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.824660 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5wwt\" (UniqueName: \"kubernetes.io/projected/853870d0-0afc-4f70-908b-c07c78d3fa8a-kube-api-access-j5wwt\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:44 crc kubenswrapper[5033]: I0319 20:06:44.825386 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853870d0-0afc-4f70-908b-c07c78d3fa8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.040248 5033 generic.go:334] "Generic (PLEG): container finished" podID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerID="fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081" exitCode=0 Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.041466 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerDied","Data":"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081"} Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.041572 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2fvnm" event={"ID":"853870d0-0afc-4f70-908b-c07c78d3fa8a","Type":"ContainerDied","Data":"423d500a472e7032b383353de218818e0cb435ac8703635c92f0a9ee812b2ff0"} Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.041647 5033 scope.go:117] "RemoveContainer" containerID="fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.041819 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2fvnm" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.068941 5033 scope.go:117] "RemoveContainer" containerID="bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.075109 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.084356 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2fvnm"] Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.110418 5033 scope.go:117] "RemoveContainer" containerID="8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.140222 5033 scope.go:117] "RemoveContainer" containerID="fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081" Mar 19 20:06:45 crc kubenswrapper[5033]: E0319 20:06:45.140747 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081\": container with ID starting with fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081 not found: ID does not exist" containerID="fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.140805 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081"} err="failed to get container status \"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081\": rpc error: code = NotFound desc = could not find container \"fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081\": container with ID starting with fdb5c7f3e01cb9dc1a354ef9241c68ee017ce3a658d8fdc35b93ca9d84bc5081 not 
found: ID does not exist" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.140844 5033 scope.go:117] "RemoveContainer" containerID="bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf" Mar 19 20:06:45 crc kubenswrapper[5033]: E0319 20:06:45.141351 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf\": container with ID starting with bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf not found: ID does not exist" containerID="bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.141381 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf"} err="failed to get container status \"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf\": rpc error: code = NotFound desc = could not find container \"bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf\": container with ID starting with bff1623e43b92479a69774409aebac7362dfbf5b2b629c70f0ec873cd66e26bf not found: ID does not exist" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.141402 5033 scope.go:117] "RemoveContainer" containerID="8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc" Mar 19 20:06:45 crc kubenswrapper[5033]: E0319 20:06:45.141726 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc\": container with ID starting with 8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc not found: ID does not exist" containerID="8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc" Mar 19 20:06:45 crc kubenswrapper[5033]: I0319 20:06:45.141746 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc"} err="failed to get container status \"8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc\": rpc error: code = NotFound desc = could not find container \"8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc\": container with ID starting with 8a46224204ffe66c5e9148b7a4456c3d54b6fe8459fa28a2263ce6ef9e2db3bc not found: ID does not exist" Mar 19 20:06:46 crc kubenswrapper[5033]: I0319 20:06:46.633185 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" path="/var/lib/kubelet/pods/853870d0-0afc-4f70-908b-c07c78d3fa8a/volumes" Mar 19 20:06:51 crc kubenswrapper[5033]: I0319 20:06:51.620644 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:06:53 crc kubenswrapper[5033]: I0319 20:06:53.133279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942"} Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.408391 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:07:06 crc kubenswrapper[5033]: E0319 20:07:06.409557 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="extract-content" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.409574 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="extract-content" Mar 19 20:07:06 crc kubenswrapper[5033]: E0319 20:07:06.409594 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="registry-server" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.409600 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="registry-server" Mar 19 20:07:06 crc kubenswrapper[5033]: E0319 20:07:06.409637 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="extract-utilities" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.409643 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="extract-utilities" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.409835 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="853870d0-0afc-4f70-908b-c07c78d3fa8a" containerName="registry-server" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.410635 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.417983 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.418051 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w5vr6" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.418262 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.418982 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.427142 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.586954 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7tg\" (UniqueName: \"kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587076 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587121 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587150 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587296 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587355 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.587654 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.689737 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.689831 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.689882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.689969 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7tg\" (UniqueName: \"kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690012 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690092 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690119 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690204 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.690883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.691082 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.691789 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.693714 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.699218 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.699428 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.707653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " 
pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.709352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7tg\" (UniqueName: \"kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:06 crc kubenswrapper[5033]: I0319 20:07:06.745321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " pod="openstack/tempest-tests-tempest" Mar 19 20:07:07 crc kubenswrapper[5033]: I0319 20:07:07.033344 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:07:07 crc kubenswrapper[5033]: I0319 20:07:07.473414 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:07:08 crc kubenswrapper[5033]: I0319 20:07:08.291221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"190e5876-b872-4c45-a860-696c0e739f2b","Type":"ContainerStarted","Data":"b36e66eb4d731ef3d0438a445312e69db4bc4274750dd1812d2873ccde6ab16f"} Mar 19 20:07:35 crc kubenswrapper[5033]: E0319 20:07:35.352498 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 20:07:35 crc kubenswrapper[5033]: E0319 20:07:35.353654 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kq7tg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(190e5876-b872-4c45-a860-696c0e739f2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:07:35 crc kubenswrapper[5033]: E0319 20:07:35.355717 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="190e5876-b872-4c45-a860-696c0e739f2b" Mar 19 20:07:35 crc kubenswrapper[5033]: E0319 20:07:35.580052 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="190e5876-b872-4c45-a860-696c0e739f2b" Mar 19 20:07:50 crc 
kubenswrapper[5033]: I0319 20:07:50.150085 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:07:51 crc kubenswrapper[5033]: I0319 20:07:51.757254 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"190e5876-b872-4c45-a860-696c0e739f2b","Type":"ContainerStarted","Data":"7cd1d462bf1cb337bded18305121392df9c0bd61d7501e7cb5fb732fb81bce2f"} Mar 19 20:07:51 crc kubenswrapper[5033]: I0319 20:07:51.784237 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.110289675 podStartE2EDuration="46.78420823s" podCreationTimestamp="2026-03-19 20:07:05 +0000 UTC" firstStartedPulling="2026-03-19 20:07:07.47368793 +0000 UTC m=+4237.578717779" lastFinishedPulling="2026-03-19 20:07:50.147606495 +0000 UTC m=+4280.252636334" observedRunningTime="2026-03-19 20:07:51.778675224 +0000 UTC m=+4281.883705093" watchObservedRunningTime="2026-03-19 20:07:51.78420823 +0000 UTC m=+4281.889238079" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.153887 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565848-f5t2h"] Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.155734 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.157730 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.157734 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.158936 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.178833 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-f5t2h"] Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.252435 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c8sz\" (UniqueName: \"kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz\") pod \"auto-csr-approver-29565848-f5t2h\" (UID: \"6a20f917-0f71-4350-9000-c676b2f640be\") " pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.355121 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c8sz\" (UniqueName: \"kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz\") pod \"auto-csr-approver-29565848-f5t2h\" (UID: \"6a20f917-0f71-4350-9000-c676b2f640be\") " pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.380519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c8sz\" (UniqueName: \"kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz\") pod \"auto-csr-approver-29565848-f5t2h\" (UID: \"6a20f917-0f71-4350-9000-c676b2f640be\") " 
pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.474751 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:00 crc kubenswrapper[5033]: I0319 20:08:00.954822 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-f5t2h"] Mar 19 20:08:01 crc kubenswrapper[5033]: I0319 20:08:01.863762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" event={"ID":"6a20f917-0f71-4350-9000-c676b2f640be","Type":"ContainerStarted","Data":"0e6d762bf87e0626f0aefefc8becca08b0e096cb500222f8bbeabe9c9df15ff3"} Mar 19 20:08:03 crc kubenswrapper[5033]: I0319 20:08:03.884556 5033 generic.go:334] "Generic (PLEG): container finished" podID="6a20f917-0f71-4350-9000-c676b2f640be" containerID="dcac3d1e6787ab430a948b897c2b87d1e780e7b80709ea28b1e367b6b1e4d480" exitCode=0 Mar 19 20:08:03 crc kubenswrapper[5033]: I0319 20:08:03.884611 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" event={"ID":"6a20f917-0f71-4350-9000-c676b2f640be","Type":"ContainerDied","Data":"dcac3d1e6787ab430a948b897c2b87d1e780e7b80709ea28b1e367b6b1e4d480"} Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.355244 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.460679 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c8sz\" (UniqueName: \"kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz\") pod \"6a20f917-0f71-4350-9000-c676b2f640be\" (UID: \"6a20f917-0f71-4350-9000-c676b2f640be\") " Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.470072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz" (OuterVolumeSpecName: "kube-api-access-5c8sz") pod "6a20f917-0f71-4350-9000-c676b2f640be" (UID: "6a20f917-0f71-4350-9000-c676b2f640be"). InnerVolumeSpecName "kube-api-access-5c8sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.563264 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c8sz\" (UniqueName: \"kubernetes.io/projected/6a20f917-0f71-4350-9000-c676b2f640be-kube-api-access-5c8sz\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.911652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" event={"ID":"6a20f917-0f71-4350-9000-c676b2f640be","Type":"ContainerDied","Data":"0e6d762bf87e0626f0aefefc8becca08b0e096cb500222f8bbeabe9c9df15ff3"} Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.911703 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6d762bf87e0626f0aefefc8becca08b0e096cb500222f8bbeabe9c9df15ff3" Mar 19 20:08:05 crc kubenswrapper[5033]: I0319 20:08:05.911773 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-f5t2h" Mar 19 20:08:06 crc kubenswrapper[5033]: I0319 20:08:06.448259 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-sb8sk"] Mar 19 20:08:06 crc kubenswrapper[5033]: I0319 20:08:06.456640 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-sb8sk"] Mar 19 20:08:06 crc kubenswrapper[5033]: I0319 20:08:06.630736 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c653c9-a189-4a7d-a694-4c53306c0c4e" path="/var/lib/kubelet/pods/97c653c9-a189-4a7d-a694-4c53306c0c4e/volumes" Mar 19 20:08:35 crc kubenswrapper[5033]: I0319 20:08:35.297627 5033 scope.go:117] "RemoveContainer" containerID="3c8d521f51d724190ba9d91ba168ba3b48c0de14adea2d4738d5b050e348ca13" Mar 19 20:09:10 crc kubenswrapper[5033]: I0319 20:09:10.758931 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:09:10 crc kubenswrapper[5033]: I0319 20:09:10.759559 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:09:40 crc kubenswrapper[5033]: I0319 20:09:40.758998 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:09:40 crc kubenswrapper[5033]: 
I0319 20:09:40.759607 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.144425 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565850-wdtdf"] Mar 19 20:10:00 crc kubenswrapper[5033]: E0319 20:10:00.145357 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a20f917-0f71-4350-9000-c676b2f640be" containerName="oc" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.145369 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a20f917-0f71-4350-9000-c676b2f640be" containerName="oc" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.145659 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a20f917-0f71-4350-9000-c676b2f640be" containerName="oc" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.146399 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.153463 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.153649 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.153793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.161545 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-wdtdf"] Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.208947 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fxg\" (UniqueName: \"kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg\") pod \"auto-csr-approver-29565850-wdtdf\" (UID: \"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99\") " pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.311309 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fxg\" (UniqueName: \"kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg\") pod \"auto-csr-approver-29565850-wdtdf\" (UID: \"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99\") " pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.333943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fxg\" (UniqueName: \"kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg\") pod \"auto-csr-approver-29565850-wdtdf\" (UID: \"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99\") " 
pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.471685 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:00 crc kubenswrapper[5033]: I0319 20:10:00.980617 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-wdtdf"] Mar 19 20:10:01 crc kubenswrapper[5033]: I0319 20:10:01.077160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" event={"ID":"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99","Type":"ContainerStarted","Data":"0370370ee0bcf73aa5d17e7c8a4985a08c201bdde0d9c8302fec88e32a571b55"} Mar 19 20:10:03 crc kubenswrapper[5033]: I0319 20:10:03.099852 5033 generic.go:334] "Generic (PLEG): container finished" podID="7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" containerID="c1e14358f99c9d7203c4566b26e6758d7c77501352f7cec22f032280bf649424" exitCode=0 Mar 19 20:10:03 crc kubenswrapper[5033]: I0319 20:10:03.099929 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" event={"ID":"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99","Type":"ContainerDied","Data":"c1e14358f99c9d7203c4566b26e6758d7c77501352f7cec22f032280bf649424"} Mar 19 20:10:04 crc kubenswrapper[5033]: I0319 20:10:04.713156 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:04 crc kubenswrapper[5033]: I0319 20:10:04.908766 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7fxg\" (UniqueName: \"kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg\") pod \"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99\" (UID: \"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99\") " Mar 19 20:10:04 crc kubenswrapper[5033]: I0319 20:10:04.914661 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg" (OuterVolumeSpecName: "kube-api-access-v7fxg") pod "7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" (UID: "7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99"). InnerVolumeSpecName "kube-api-access-v7fxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.011909 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7fxg\" (UniqueName: \"kubernetes.io/projected/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99-kube-api-access-v7fxg\") on node \"crc\" DevicePath \"\"" Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.119631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" event={"ID":"7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99","Type":"ContainerDied","Data":"0370370ee0bcf73aa5d17e7c8a4985a08c201bdde0d9c8302fec88e32a571b55"} Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.119678 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0370370ee0bcf73aa5d17e7c8a4985a08c201bdde0d9c8302fec88e32a571b55" Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.119698 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-wdtdf" Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.801396 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-n2r2j"] Mar 19 20:10:05 crc kubenswrapper[5033]: I0319 20:10:05.814954 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-n2r2j"] Mar 19 20:10:06 crc kubenswrapper[5033]: I0319 20:10:06.633220 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ddc6918-5028-4cfe-bc1e-7e92def5be70" path="/var/lib/kubelet/pods/4ddc6918-5028-4cfe-bc1e-7e92def5be70/volumes" Mar 19 20:10:10 crc kubenswrapper[5033]: I0319 20:10:10.758721 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:10:10 crc kubenswrapper[5033]: I0319 20:10:10.759143 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:10:10 crc kubenswrapper[5033]: I0319 20:10:10.759179 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 20:10:10 crc kubenswrapper[5033]: I0319 20:10:10.759713 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:10:10 crc kubenswrapper[5033]: I0319 20:10:10.759759 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942" gracePeriod=600 Mar 19 20:10:11 crc kubenswrapper[5033]: I0319 20:10:11.187661 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942" exitCode=0 Mar 19 20:10:11 crc kubenswrapper[5033]: I0319 20:10:11.187758 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942"} Mar 19 20:10:11 crc kubenswrapper[5033]: I0319 20:10:11.188004 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe"} Mar 19 20:10:11 crc kubenswrapper[5033]: I0319 20:10:11.188029 5033 scope.go:117] "RemoveContainer" containerID="5616ef744c232ebc7a5520880fa77592731577dde5985201edd04be5f84249e1" Mar 19 20:10:35 crc kubenswrapper[5033]: I0319 20:10:35.433325 5033 scope.go:117] "RemoveContainer" containerID="6880cfaf3c72c45e4f1d52f107d3f1b9980ec06d019880299ff072455731d042" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.406513 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fkdb8"] Mar 19 20:11:49 crc kubenswrapper[5033]: E0319 
20:11:49.407627 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" containerName="oc" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.407644 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" containerName="oc" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.407896 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" containerName="oc" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.409854 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.419184 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkdb8"] Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.584063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-catalog-content\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.584463 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-utilities\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.584614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48zzn\" (UniqueName: \"kubernetes.io/projected/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-kube-api-access-48zzn\") pod 
\"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.686852 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-catalog-content\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.686948 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-utilities\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.687026 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48zzn\" (UniqueName: \"kubernetes.io/projected/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-kube-api-access-48zzn\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.687676 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-catalog-content\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:49 crc kubenswrapper[5033]: I0319 20:11:49.687748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-utilities\") pod \"community-operators-fkdb8\" (UID: 
\"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:50 crc kubenswrapper[5033]: I0319 20:11:50.192680 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48zzn\" (UniqueName: \"kubernetes.io/projected/fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152-kube-api-access-48zzn\") pod \"community-operators-fkdb8\" (UID: \"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152\") " pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:50 crc kubenswrapper[5033]: I0319 20:11:50.335547 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:11:51 crc kubenswrapper[5033]: I0319 20:11:51.129354 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkdb8"] Mar 19 20:11:51 crc kubenswrapper[5033]: I0319 20:11:51.212973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkdb8" event={"ID":"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152","Type":"ContainerStarted","Data":"e691e2bcadf32f6f66adb0bbdc2b246629443bcc923a567a38e9c8be18e30704"} Mar 19 20:11:52 crc kubenswrapper[5033]: I0319 20:11:52.223385 5033 generic.go:334] "Generic (PLEG): container finished" podID="fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152" containerID="54d6fec5fbaeb0a7c6c16b93e2859f5551dff8bdc6dbc868c89d2d4d182b76f7" exitCode=0 Mar 19 20:11:52 crc kubenswrapper[5033]: I0319 20:11:52.223489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkdb8" event={"ID":"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152","Type":"ContainerDied","Data":"54d6fec5fbaeb0a7c6c16b93e2859f5551dff8bdc6dbc868c89d2d4d182b76f7"} Mar 19 20:11:52 crc kubenswrapper[5033]: I0319 20:11:52.226345 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.148826 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565852-rbwmt"] Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.150975 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.155745 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.155896 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.155970 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.165073 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46bs\" (UniqueName: \"kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs\") pod \"auto-csr-approver-29565852-rbwmt\" (UID: \"4606ed7c-b667-43a1-93e8-f145b487594a\") " pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.168598 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-rbwmt"] Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.267303 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46bs\" (UniqueName: \"kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs\") pod \"auto-csr-approver-29565852-rbwmt\" (UID: \"4606ed7c-b667-43a1-93e8-f145b487594a\") " pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.493802 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g46bs\" (UniqueName: \"kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs\") pod \"auto-csr-approver-29565852-rbwmt\" (UID: \"4606ed7c-b667-43a1-93e8-f145b487594a\") " pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:00 crc kubenswrapper[5033]: I0319 20:12:00.775386 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:03 crc kubenswrapper[5033]: I0319 20:12:03.769141 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-rbwmt"] Mar 19 20:12:04 crc kubenswrapper[5033]: I0319 20:12:04.348899 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" event={"ID":"4606ed7c-b667-43a1-93e8-f145b487594a","Type":"ContainerStarted","Data":"7aaf586912765fed56db9c617c3e4d9b4281a61dc8278982936dfe3648566a19"} Mar 19 20:12:04 crc kubenswrapper[5033]: I0319 20:12:04.351498 5033 generic.go:334] "Generic (PLEG): container finished" podID="fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152" containerID="bb71d2c45bf7f8228c2109d5af3f04559af624071b6810ea24398e30569502d0" exitCode=0 Mar 19 20:12:04 crc kubenswrapper[5033]: I0319 20:12:04.351540 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkdb8" event={"ID":"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152","Type":"ContainerDied","Data":"bb71d2c45bf7f8228c2109d5af3f04559af624071b6810ea24398e30569502d0"} Mar 19 20:12:05 crc kubenswrapper[5033]: I0319 20:12:05.363756 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkdb8" event={"ID":"fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152","Type":"ContainerStarted","Data":"7e55299d78f699d4b5e5bf284e7a70cc324a582ba97bbc75925239c5329d3e70"} Mar 19 20:12:05 crc kubenswrapper[5033]: I0319 20:12:05.438013 5033 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-fkdb8" podStartSLOduration=3.85281571 podStartE2EDuration="16.43799345s" podCreationTimestamp="2026-03-19 20:11:49 +0000 UTC" firstStartedPulling="2026-03-19 20:11:52.226071873 +0000 UTC m=+4522.331101722" lastFinishedPulling="2026-03-19 20:12:04.811249603 +0000 UTC m=+4534.916279462" observedRunningTime="2026-03-19 20:12:05.397991971 +0000 UTC m=+4535.503021840" watchObservedRunningTime="2026-03-19 20:12:05.43799345 +0000 UTC m=+4535.543023299" Mar 19 20:12:06 crc kubenswrapper[5033]: I0319 20:12:06.376376 5033 generic.go:334] "Generic (PLEG): container finished" podID="4606ed7c-b667-43a1-93e8-f145b487594a" containerID="99f480714ebd2ae3e4b0405adf6771e07bcce962fd477a5c7c5b15c2c23869d4" exitCode=0 Mar 19 20:12:06 crc kubenswrapper[5033]: I0319 20:12:06.376442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" event={"ID":"4606ed7c-b667-43a1-93e8-f145b487594a","Type":"ContainerDied","Data":"99f480714ebd2ae3e4b0405adf6771e07bcce962fd477a5c7c5b15c2c23869d4"} Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.397961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" event={"ID":"4606ed7c-b667-43a1-93e8-f145b487594a","Type":"ContainerDied","Data":"7aaf586912765fed56db9c617c3e4d9b4281a61dc8278982936dfe3648566a19"} Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.398387 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aaf586912765fed56db9c617c3e4d9b4281a61dc8278982936dfe3648566a19" Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.427965 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.571156 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46bs\" (UniqueName: \"kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs\") pod \"4606ed7c-b667-43a1-93e8-f145b487594a\" (UID: \"4606ed7c-b667-43a1-93e8-f145b487594a\") " Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.581763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs" (OuterVolumeSpecName: "kube-api-access-g46bs") pod "4606ed7c-b667-43a1-93e8-f145b487594a" (UID: "4606ed7c-b667-43a1-93e8-f145b487594a"). InnerVolumeSpecName "kube-api-access-g46bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:08 crc kubenswrapper[5033]: I0319 20:12:08.674407 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46bs\" (UniqueName: \"kubernetes.io/projected/4606ed7c-b667-43a1-93e8-f145b487594a-kube-api-access-g46bs\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:09 crc kubenswrapper[5033]: I0319 20:12:09.407134 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-rbwmt" Mar 19 20:12:09 crc kubenswrapper[5033]: I0319 20:12:09.522117 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-ks689"] Mar 19 20:12:09 crc kubenswrapper[5033]: I0319 20:12:09.532329 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-ks689"] Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.336711 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.336778 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.392139 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.471851 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fkdb8" Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.674317 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6483248d-cb85-48f6-a317-d34866702c5c" path="/var/lib/kubelet/pods/6483248d-cb85-48f6-a317-d34866702c5c/volumes" Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.681180 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkdb8"] Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.730905 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 20:12:10 crc kubenswrapper[5033]: I0319 20:12:10.731135 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mrg7t" 
podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="registry-server" containerID="cri-o://9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" gracePeriod=2 Mar 19 20:12:11 crc kubenswrapper[5033]: E0319 20:12:11.383925 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1 is running failed: container process not found" containerID="9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:12:11 crc kubenswrapper[5033]: E0319 20:12:11.384626 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1 is running failed: container process not found" containerID="9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:12:11 crc kubenswrapper[5033]: E0319 20:12:11.385058 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1 is running failed: container process not found" containerID="9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:12:11 crc kubenswrapper[5033]: E0319 20:12:11.385087 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-mrg7t" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="registry-server" Mar 19 
20:12:11 crc kubenswrapper[5033]: I0319 20:12:11.426058 5033 generic.go:334] "Generic (PLEG): container finished" podID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerID="9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" exitCode=0 Mar 19 20:12:11 crc kubenswrapper[5033]: I0319 20:12:11.426438 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerDied","Data":"9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1"} Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.006823 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.074055 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcncm\" (UniqueName: \"kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm\") pod \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.074237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities\") pod \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.074316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content\") pod \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\" (UID: \"63ec342c-bcf6-4ec4-838b-f0af17eb5aac\") " Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.074866 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities" (OuterVolumeSpecName: "utilities") pod "63ec342c-bcf6-4ec4-838b-f0af17eb5aac" (UID: "63ec342c-bcf6-4ec4-838b-f0af17eb5aac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.095736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm" (OuterVolumeSpecName: "kube-api-access-dcncm") pod "63ec342c-bcf6-4ec4-838b-f0af17eb5aac" (UID: "63ec342c-bcf6-4ec4-838b-f0af17eb5aac"). InnerVolumeSpecName "kube-api-access-dcncm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.166363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63ec342c-bcf6-4ec4-838b-f0af17eb5aac" (UID: "63ec342c-bcf6-4ec4-838b-f0af17eb5aac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.178409 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcncm\" (UniqueName: \"kubernetes.io/projected/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-kube-api-access-dcncm\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.178712 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.178812 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63ec342c-bcf6-4ec4-838b-f0af17eb5aac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.437001 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrg7t" event={"ID":"63ec342c-bcf6-4ec4-838b-f0af17eb5aac","Type":"ContainerDied","Data":"59543798e46a3b207e760d13bc5d5eaf2989e1700b49d394e04084f0adbdff0f"} Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.437083 5033 scope.go:117] "RemoveContainer" containerID="9f9fd8e3974d347b24262d9c1f3d699361370674d514181044604fb9d8d509e1" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.437035 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrg7t" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.462745 5033 scope.go:117] "RemoveContainer" containerID="3802fac55be63f703d118c253d75d76a1b8c1e910b72baea680f5548baaa212f" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.488064 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.495350 5033 scope.go:117] "RemoveContainer" containerID="46894d7a23efff17aeb7836e73004de72438e6bf40b599ddc766813e7324691a" Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.510248 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mrg7t"] Mar 19 20:12:12 crc kubenswrapper[5033]: I0319 20:12:12.634735 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" path="/var/lib/kubelet/pods/63ec342c-bcf6-4ec4-838b-f0af17eb5aac/volumes" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.851191 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:12:32 crc kubenswrapper[5033]: E0319 20:12:32.852946 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4606ed7c-b667-43a1-93e8-f145b487594a" containerName="oc" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853030 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4606ed7c-b667-43a1-93e8-f145b487594a" containerName="oc" Mar 19 20:12:32 crc kubenswrapper[5033]: E0319 20:12:32.853126 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="extract-content" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853183 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="extract-content" Mar 19 20:12:32 crc 
kubenswrapper[5033]: E0319 20:12:32.853248 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="registry-server" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853327 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="registry-server" Mar 19 20:12:32 crc kubenswrapper[5033]: E0319 20:12:32.853395 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="extract-utilities" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853464 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="extract-utilities" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853739 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ec342c-bcf6-4ec4-838b-f0af17eb5aac" containerName="registry-server" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.853827 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4606ed7c-b667-43a1-93e8-f145b487594a" containerName="oc" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.856951 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:32 crc kubenswrapper[5033]: I0319 20:12:32.873608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.052335 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.052379 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.052415 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mh7p\" (UniqueName: \"kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.155110 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.155179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.155227 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mh7p\" (UniqueName: \"kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.155660 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.155734 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.177249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mh7p\" (UniqueName: \"kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p\") pod \"redhat-operators-kf8cd\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.190159 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:33 crc kubenswrapper[5033]: I0319 20:12:33.977014 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:12:33 crc kubenswrapper[5033]: W0319 20:12:33.988772 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe31c421_fc20_43a5_b16f_7034e643e887.slice/crio-bdeffd9958dc4c1dd0bcba8958f24dd7192ba9474fd9fb2b76c7ff879fa9f53e WatchSource:0}: Error finding container bdeffd9958dc4c1dd0bcba8958f24dd7192ba9474fd9fb2b76c7ff879fa9f53e: Status 404 returned error can't find the container with id bdeffd9958dc4c1dd0bcba8958f24dd7192ba9474fd9fb2b76c7ff879fa9f53e Mar 19 20:12:34 crc kubenswrapper[5033]: I0319 20:12:34.657871 5033 generic.go:334] "Generic (PLEG): container finished" podID="be31c421-fc20-43a5-b16f-7034e643e887" containerID="904dcbe0fcec20397a27249435d3286fbfe5e4a42292d80dfe2897c4310cbdea" exitCode=0 Mar 19 20:12:34 crc kubenswrapper[5033]: I0319 20:12:34.657957 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerDied","Data":"904dcbe0fcec20397a27249435d3286fbfe5e4a42292d80dfe2897c4310cbdea"} Mar 19 20:12:34 crc kubenswrapper[5033]: I0319 20:12:34.658154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerStarted","Data":"bdeffd9958dc4c1dd0bcba8958f24dd7192ba9474fd9fb2b76c7ff879fa9f53e"} Mar 19 20:12:35 crc kubenswrapper[5033]: I0319 20:12:35.556346 5033 scope.go:117] "RemoveContainer" containerID="3b6c80295b7e10f3feeea18a336dd21df48982db2cc067fc70aeedecc11d38ae" Mar 19 20:12:36 crc kubenswrapper[5033]: I0319 20:12:36.679654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerStarted","Data":"b924598a6a856b9892c77150f3245be6b69ea63562ad569951c8fb9c46f7e3bb"} Mar 19 20:12:40 crc kubenswrapper[5033]: I0319 20:12:40.759244 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:12:40 crc kubenswrapper[5033]: I0319 20:12:40.759777 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:12:42 crc kubenswrapper[5033]: I0319 20:12:42.747318 5033 generic.go:334] "Generic (PLEG): container finished" podID="be31c421-fc20-43a5-b16f-7034e643e887" containerID="b924598a6a856b9892c77150f3245be6b69ea63562ad569951c8fb9c46f7e3bb" exitCode=0 Mar 19 20:12:42 crc kubenswrapper[5033]: I0319 20:12:42.747413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerDied","Data":"b924598a6a856b9892c77150f3245be6b69ea63562ad569951c8fb9c46f7e3bb"} Mar 19 20:12:43 crc kubenswrapper[5033]: I0319 20:12:43.758557 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerStarted","Data":"d97413592ecf2289540551a89218e6fb79abe9f5b6f75542cca2f86c54071983"} Mar 19 20:12:43 crc kubenswrapper[5033]: I0319 20:12:43.792512 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-kf8cd" podStartSLOduration=3.310235285 podStartE2EDuration="11.792493602s" podCreationTimestamp="2026-03-19 20:12:32 +0000 UTC" firstStartedPulling="2026-03-19 20:12:34.659676826 +0000 UTC m=+4564.764706675" lastFinishedPulling="2026-03-19 20:12:43.141935143 +0000 UTC m=+4573.246964992" observedRunningTime="2026-03-19 20:12:43.786556154 +0000 UTC m=+4573.891586013" watchObservedRunningTime="2026-03-19 20:12:43.792493602 +0000 UTC m=+4573.897523451" Mar 19 20:12:53 crc kubenswrapper[5033]: I0319 20:12:53.200433 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:53 crc kubenswrapper[5033]: I0319 20:12:53.201014 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:12:54 crc kubenswrapper[5033]: I0319 20:12:54.254863 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kf8cd" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="registry-server" probeResult="failure" output=< Mar 19 20:12:54 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 20:12:54 crc kubenswrapper[5033]: > Mar 19 20:13:03 crc kubenswrapper[5033]: I0319 20:13:03.261294 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:13:03 crc kubenswrapper[5033]: I0319 20:13:03.310655 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:13:03 crc kubenswrapper[5033]: I0319 20:13:03.498747 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:13:04 crc kubenswrapper[5033]: I0319 20:13:04.973736 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-kf8cd" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="registry-server" containerID="cri-o://d97413592ecf2289540551a89218e6fb79abe9f5b6f75542cca2f86c54071983" gracePeriod=2 Mar 19 20:13:05 crc kubenswrapper[5033]: I0319 20:13:05.987973 5033 generic.go:334] "Generic (PLEG): container finished" podID="be31c421-fc20-43a5-b16f-7034e643e887" containerID="d97413592ecf2289540551a89218e6fb79abe9f5b6f75542cca2f86c54071983" exitCode=0 Mar 19 20:13:05 crc kubenswrapper[5033]: I0319 20:13:05.988066 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerDied","Data":"d97413592ecf2289540551a89218e6fb79abe9f5b6f75542cca2f86c54071983"} Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.178587 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.254703 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mh7p\" (UniqueName: \"kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p\") pod \"be31c421-fc20-43a5-b16f-7034e643e887\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.254773 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content\") pod \"be31c421-fc20-43a5-b16f-7034e643e887\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.254941 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities\") pod 
\"be31c421-fc20-43a5-b16f-7034e643e887\" (UID: \"be31c421-fc20-43a5-b16f-7034e643e887\") " Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.255917 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities" (OuterVolumeSpecName: "utilities") pod "be31c421-fc20-43a5-b16f-7034e643e887" (UID: "be31c421-fc20-43a5-b16f-7034e643e887"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.260354 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p" (OuterVolumeSpecName: "kube-api-access-6mh7p") pod "be31c421-fc20-43a5-b16f-7034e643e887" (UID: "be31c421-fc20-43a5-b16f-7034e643e887"). InnerVolumeSpecName "kube-api-access-6mh7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.357051 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mh7p\" (UniqueName: \"kubernetes.io/projected/be31c421-fc20-43a5-b16f-7034e643e887-kube-api-access-6mh7p\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.357079 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.406444 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be31c421-fc20-43a5-b16f-7034e643e887" (UID: "be31c421-fc20-43a5-b16f-7034e643e887"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.459333 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be31c421-fc20-43a5-b16f-7034e643e887-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.999287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf8cd" event={"ID":"be31c421-fc20-43a5-b16f-7034e643e887","Type":"ContainerDied","Data":"bdeffd9958dc4c1dd0bcba8958f24dd7192ba9474fd9fb2b76c7ff879fa9f53e"} Mar 19 20:13:06 crc kubenswrapper[5033]: I0319 20:13:06.999344 5033 scope.go:117] "RemoveContainer" containerID="d97413592ecf2289540551a89218e6fb79abe9f5b6f75542cca2f86c54071983" Mar 19 20:13:07 crc kubenswrapper[5033]: I0319 20:13:06.999399 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf8cd" Mar 19 20:13:07 crc kubenswrapper[5033]: I0319 20:13:07.023121 5033 scope.go:117] "RemoveContainer" containerID="b924598a6a856b9892c77150f3245be6b69ea63562ad569951c8fb9c46f7e3bb" Mar 19 20:13:07 crc kubenswrapper[5033]: I0319 20:13:07.038079 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:13:07 crc kubenswrapper[5033]: I0319 20:13:07.052236 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kf8cd"] Mar 19 20:13:07 crc kubenswrapper[5033]: I0319 20:13:07.054226 5033 scope.go:117] "RemoveContainer" containerID="904dcbe0fcec20397a27249435d3286fbfe5e4a42292d80dfe2897c4310cbdea" Mar 19 20:13:08 crc kubenswrapper[5033]: I0319 20:13:08.631696 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be31c421-fc20-43a5-b16f-7034e643e887" path="/var/lib/kubelet/pods/be31c421-fc20-43a5-b16f-7034e643e887/volumes" Mar 19 20:13:10 crc 
kubenswrapper[5033]: I0319 20:13:10.758730 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:13:10 crc kubenswrapper[5033]: I0319 20:13:10.759980 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:13:17 crc kubenswrapper[5033]: I0319 20:13:17.100524 5033 generic.go:334] "Generic (PLEG): container finished" podID="190e5876-b872-4c45-a860-696c0e739f2b" containerID="7cd1d462bf1cb337bded18305121392df9c0bd61d7501e7cb5fb732fb81bce2f" exitCode=0 Mar 19 20:13:17 crc kubenswrapper[5033]: I0319 20:13:17.101156 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"190e5876-b872-4c45-a860-696c0e739f2b","Type":"ContainerDied","Data":"7cd1d462bf1cb337bded18305121392df9c0bd61d7501e7cb5fb732fb81bce2f"} Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.332278 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444215 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444271 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444353 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444375 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444416 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444439 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq7tg\" 
(UniqueName: \"kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444495 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444519 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.444680 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config\") pod \"190e5876-b872-4c45-a860-696c0e739f2b\" (UID: \"190e5876-b872-4c45-a860-696c0e739f2b\") " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.446629 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data" (OuterVolumeSpecName: "config-data") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.449103 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.453712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg" (OuterVolumeSpecName: "kube-api-access-kq7tg") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "kube-api-access-kq7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.459619 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.495103 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.514638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551270 5033 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551314 5033 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551344 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551357 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551370 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq7tg\" (UniqueName: \"kubernetes.io/projected/190e5876-b872-4c45-a860-696c0e739f2b-kube-api-access-kq7tg\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.551386 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.585383 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.590804 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.606112 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.653335 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.653363 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.653375 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/190e5876-b872-4c45-a860-696c0e739f2b-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.914025 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "190e5876-b872-4c45-a860-696c0e739f2b" (UID: "190e5876-b872-4c45-a860-696c0e739f2b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:19 crc kubenswrapper[5033]: I0319 20:13:19.960250 5033 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/190e5876-b872-4c45-a860-696c0e739f2b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:20 crc kubenswrapper[5033]: I0319 20:13:20.129048 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"190e5876-b872-4c45-a860-696c0e739f2b","Type":"ContainerDied","Data":"b36e66eb4d731ef3d0438a445312e69db4bc4274750dd1812d2873ccde6ab16f"} Mar 19 20:13:20 crc kubenswrapper[5033]: I0319 20:13:20.129086 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36e66eb4d731ef3d0438a445312e69db4bc4274750dd1812d2873ccde6ab16f" Mar 19 20:13:20 crc kubenswrapper[5033]: I0319 20:13:20.129153 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.597746 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:13:24 crc kubenswrapper[5033]: E0319 20:13:24.599038 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190e5876-b872-4c45-a860-696c0e739f2b" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599057 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="190e5876-b872-4c45-a860-696c0e739f2b" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:13:24 crc kubenswrapper[5033]: E0319 20:13:24.599088 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="extract-utilities" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599097 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="extract-utilities" Mar 19 20:13:24 crc kubenswrapper[5033]: E0319 20:13:24.599125 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="registry-server" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599133 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="registry-server" Mar 19 20:13:24 crc kubenswrapper[5033]: E0319 20:13:24.599163 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="extract-content" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599172 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="extract-content" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599461 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be31c421-fc20-43a5-b16f-7034e643e887" containerName="registry-server" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.599511 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="190e5876-b872-4c45-a860-696c0e739f2b" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.600563 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.609828 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.612001 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w5vr6" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.742555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.742616 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppd9c\" (UniqueName: \"kubernetes.io/projected/c00f2e97-0513-4dd0-ad6d-851e50ba920f-kube-api-access-ppd9c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.844948 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.845004 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppd9c\" (UniqueName: 
\"kubernetes.io/projected/c00f2e97-0513-4dd0-ad6d-851e50ba920f-kube-api-access-ppd9c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.845609 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.876647 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppd9c\" (UniqueName: \"kubernetes.io/projected/c00f2e97-0513-4dd0-ad6d-851e50ba920f-kube-api-access-ppd9c\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.911698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c00f2e97-0513-4dd0-ad6d-851e50ba920f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:24 crc kubenswrapper[5033]: I0319 20:13:24.931119 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:13:25 crc kubenswrapper[5033]: I0319 20:13:25.470654 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:13:26 crc kubenswrapper[5033]: I0319 20:13:26.196399 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c00f2e97-0513-4dd0-ad6d-851e50ba920f","Type":"ContainerStarted","Data":"f498dc513535ef9dc122c64c6501dcf2702b34e440e3ba49f778a46b4788624d"} Mar 19 20:13:28 crc kubenswrapper[5033]: I0319 20:13:28.217126 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c00f2e97-0513-4dd0-ad6d-851e50ba920f","Type":"ContainerStarted","Data":"6b76637d29c5ade40ec8e9b48eb8d106ac7643bc23b40f17b436ee5442bb0a47"} Mar 19 20:13:28 crc kubenswrapper[5033]: I0319 20:13:28.235817 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.646588473 podStartE2EDuration="4.23579536s" podCreationTimestamp="2026-03-19 20:13:24 +0000 UTC" firstStartedPulling="2026-03-19 20:13:25.487641308 +0000 UTC m=+4615.592671157" lastFinishedPulling="2026-03-19 20:13:27.076848195 +0000 UTC m=+4617.181878044" observedRunningTime="2026-03-19 20:13:28.230622964 +0000 UTC m=+4618.335652813" watchObservedRunningTime="2026-03-19 20:13:28.23579536 +0000 UTC m=+4618.340825209" Mar 19 20:13:40 crc kubenswrapper[5033]: I0319 20:13:40.759188 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:13:40 crc 
kubenswrapper[5033]: I0319 20:13:40.759814 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:13:40 crc kubenswrapper[5033]: I0319 20:13:40.759854 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 20:13:40 crc kubenswrapper[5033]: I0319 20:13:40.760646 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:13:40 crc kubenswrapper[5033]: I0319 20:13:40.760698 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" gracePeriod=600 Mar 19 20:13:40 crc kubenswrapper[5033]: E0319 20:13:40.882884 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:13:41 crc kubenswrapper[5033]: I0319 20:13:41.325881 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" exitCode=0 Mar 19 20:13:41 crc kubenswrapper[5033]: I0319 20:13:41.325922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe"} Mar 19 20:13:41 crc kubenswrapper[5033]: I0319 20:13:41.325960 5033 scope.go:117] "RemoveContainer" containerID="b7e2540710fc85efc18da27c84c8299328041969b9e887611438a7427e632942" Mar 19 20:13:41 crc kubenswrapper[5033]: I0319 20:13:41.326378 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:13:41 crc kubenswrapper[5033]: E0319 20:13:41.326672 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:13:52 crc kubenswrapper[5033]: I0319 20:13:52.621824 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:13:52 crc kubenswrapper[5033]: E0319 20:13:52.622871 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.144238 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565854-xlsjz"] Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.146041 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.155345 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-xlsjz"] Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.157450 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.157827 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.158233 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.234851 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tm5w\" (UniqueName: \"kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w\") pod \"auto-csr-approver-29565854-xlsjz\" (UID: \"e31ccec7-6d67-4152-8128-afca55385d33\") " pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.337110 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tm5w\" (UniqueName: \"kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w\") pod \"auto-csr-approver-29565854-xlsjz\" (UID: \"e31ccec7-6d67-4152-8128-afca55385d33\") " pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:00 crc 
kubenswrapper[5033]: I0319 20:14:00.378327 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tm5w\" (UniqueName: \"kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w\") pod \"auto-csr-approver-29565854-xlsjz\" (UID: \"e31ccec7-6d67-4152-8128-afca55385d33\") " pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:00 crc kubenswrapper[5033]: I0319 20:14:00.461928 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:01 crc kubenswrapper[5033]: I0319 20:14:01.314232 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-xlsjz"] Mar 19 20:14:01 crc kubenswrapper[5033]: I0319 20:14:01.545374 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" event={"ID":"e31ccec7-6d67-4152-8128-afca55385d33","Type":"ContainerStarted","Data":"499a40960198681b49ab337c4592fd5d45e90f0cc265003cadf10c2aa814eee0"} Mar 19 20:14:02 crc kubenswrapper[5033]: I0319 20:14:02.554425 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" event={"ID":"e31ccec7-6d67-4152-8128-afca55385d33","Type":"ContainerStarted","Data":"41c8d2dd39d728831efe5582486639e477e74bd5d1c414a0cba1fb6f3be69a74"} Mar 19 20:14:02 crc kubenswrapper[5033]: I0319 20:14:02.571766 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" podStartSLOduration=1.728110394 podStartE2EDuration="2.571750381s" podCreationTimestamp="2026-03-19 20:14:00 +0000 UTC" firstStartedPulling="2026-03-19 20:14:01.295752773 +0000 UTC m=+4651.400782622" lastFinishedPulling="2026-03-19 20:14:02.13939276 +0000 UTC m=+4652.244422609" observedRunningTime="2026-03-19 20:14:02.568599092 +0000 UTC m=+4652.673628941" watchObservedRunningTime="2026-03-19 
20:14:02.571750381 +0000 UTC m=+4652.676780230" Mar 19 20:14:03 crc kubenswrapper[5033]: I0319 20:14:03.563895 5033 generic.go:334] "Generic (PLEG): container finished" podID="e31ccec7-6d67-4152-8128-afca55385d33" containerID="41c8d2dd39d728831efe5582486639e477e74bd5d1c414a0cba1fb6f3be69a74" exitCode=0 Mar 19 20:14:03 crc kubenswrapper[5033]: I0319 20:14:03.563937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" event={"ID":"e31ccec7-6d67-4152-8128-afca55385d33","Type":"ContainerDied","Data":"41c8d2dd39d728831efe5582486639e477e74bd5d1c414a0cba1fb6f3be69a74"} Mar 19 20:14:04 crc kubenswrapper[5033]: I0319 20:14:04.626913 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:14:04 crc kubenswrapper[5033]: E0319 20:14:04.628087 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.224689 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.262096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tm5w\" (UniqueName: \"kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w\") pod \"e31ccec7-6d67-4152-8128-afca55385d33\" (UID: \"e31ccec7-6d67-4152-8128-afca55385d33\") " Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.318560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w" (OuterVolumeSpecName: "kube-api-access-2tm5w") pod "e31ccec7-6d67-4152-8128-afca55385d33" (UID: "e31ccec7-6d67-4152-8128-afca55385d33"). InnerVolumeSpecName "kube-api-access-2tm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.395869 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tm5w\" (UniqueName: \"kubernetes.io/projected/e31ccec7-6d67-4152-8128-afca55385d33-kube-api-access-2tm5w\") on node \"crc\" DevicePath \"\"" Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.588344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" event={"ID":"e31ccec7-6d67-4152-8128-afca55385d33","Type":"ContainerDied","Data":"499a40960198681b49ab337c4592fd5d45e90f0cc265003cadf10c2aa814eee0"} Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.588404 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499a40960198681b49ab337c4592fd5d45e90f0cc265003cadf10c2aa814eee0" Mar 19 20:14:06 crc kubenswrapper[5033]: I0319 20:14:06.588377 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-xlsjz" Mar 19 20:14:07 crc kubenswrapper[5033]: I0319 20:14:07.300485 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-f5t2h"] Mar 19 20:14:07 crc kubenswrapper[5033]: I0319 20:14:07.310251 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-f5t2h"] Mar 19 20:14:08 crc kubenswrapper[5033]: I0319 20:14:08.631314 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a20f917-0f71-4350-9000-c676b2f640be" path="/var/lib/kubelet/pods/6a20f917-0f71-4350-9000-c676b2f640be/volumes" Mar 19 20:14:15 crc kubenswrapper[5033]: I0319 20:14:15.620270 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:14:15 crc kubenswrapper[5033]: E0319 20:14:15.621058 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.660749 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sdzkv/must-gather-5p5h2"] Mar 19 20:14:24 crc kubenswrapper[5033]: E0319 20:14:24.669992 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31ccec7-6d67-4152-8128-afca55385d33" containerName="oc" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.670029 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31ccec7-6d67-4152-8128-afca55385d33" containerName="oc" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.670317 5033 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e31ccec7-6d67-4152-8128-afca55385d33" containerName="oc" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.671553 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.673468 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sdzkv"/"default-dockercfg-x2w4j" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.675807 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sdzkv"/"kube-root-ca.crt" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.676029 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sdzkv"/"openshift-service-ca.crt" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.759334 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sdzkv/must-gather-5p5h2"] Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.766107 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.766166 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g67w\" (UniqueName: \"kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.870637 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.870682 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g67w\" (UniqueName: \"kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.871035 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.981589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g67w\" (UniqueName: \"kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w\") pod \"must-gather-5p5h2\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:24 crc kubenswrapper[5033]: I0319 20:14:24.989021 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:14:26 crc kubenswrapper[5033]: I0319 20:14:26.082712 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sdzkv/must-gather-5p5h2"] Mar 19 20:14:26 crc kubenswrapper[5033]: I0319 20:14:26.812947 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" event={"ID":"9e95d984-9a67-460b-a101-cfea0b90a4d5","Type":"ContainerStarted","Data":"41e2c5682d516cc989949c36d74d429908ba6de854b8b4693b662f57d63e2b72"} Mar 19 20:14:29 crc kubenswrapper[5033]: I0319 20:14:29.626227 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:14:29 crc kubenswrapper[5033]: E0319 20:14:29.637323 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:14:35 crc kubenswrapper[5033]: I0319 20:14:35.697311 5033 scope.go:117] "RemoveContainer" containerID="dcac3d1e6787ab430a948b897c2b87d1e780e7b80709ea28b1e367b6b1e4d480" Mar 19 20:14:41 crc kubenswrapper[5033]: I0319 20:14:41.621463 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:14:41 crc kubenswrapper[5033]: E0319 20:14:41.623438 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:14:42 crc kubenswrapper[5033]: I0319 20:14:42.988880 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" event={"ID":"9e95d984-9a67-460b-a101-cfea0b90a4d5","Type":"ContainerStarted","Data":"e225c7d2fa521bf56f8511b7bd7e707aab7cb341784689fe1f9e0dc4fcb40e31"} Mar 19 20:14:42 crc kubenswrapper[5033]: I0319 20:14:42.989260 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" event={"ID":"9e95d984-9a67-460b-a101-cfea0b90a4d5","Type":"ContainerStarted","Data":"7f1e0073f4bff60859d00487fffe3863027d69eed8a28138234934c2a894bc1d"} Mar 19 20:14:43 crc kubenswrapper[5033]: I0319 20:14:43.008355 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" podStartSLOduration=3.277572393 podStartE2EDuration="19.00833738s" podCreationTimestamp="2026-03-19 20:14:24 +0000 UTC" firstStartedPulling="2026-03-19 20:14:26.099027264 +0000 UTC m=+4676.204057113" lastFinishedPulling="2026-03-19 20:14:41.829792251 +0000 UTC m=+4691.934822100" observedRunningTime="2026-03-19 20:14:43.008210756 +0000 UTC m=+4693.113240605" watchObservedRunningTime="2026-03-19 20:14:43.00833738 +0000 UTC m=+4693.113367229" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.594120 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-d7mb4"] Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.596378 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.649034 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9kzq\" (UniqueName: \"kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.649106 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.750876 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9kzq\" (UniqueName: \"kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.750983 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.752520 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc 
kubenswrapper[5033]: I0319 20:14:50.774155 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9kzq\" (UniqueName: \"kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq\") pod \"crc-debug-d7mb4\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: I0319 20:14:50.923226 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:14:50 crc kubenswrapper[5033]: W0319 20:14:50.994793 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice/crio-52cfee5d34a5e3dac37c44f1dd0a6e8a58f472a3592a77d2b537cd1f36550cfd WatchSource:0}: Error finding container 52cfee5d34a5e3dac37c44f1dd0a6e8a58f472a3592a77d2b537cd1f36550cfd: Status 404 returned error can't find the container with id 52cfee5d34a5e3dac37c44f1dd0a6e8a58f472a3592a77d2b537cd1f36550cfd Mar 19 20:14:51 crc kubenswrapper[5033]: I0319 20:14:51.058261 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" event={"ID":"68b7039b-ceac-43c0-b8ac-40faaf00fc9e","Type":"ContainerStarted","Data":"52cfee5d34a5e3dac37c44f1dd0a6e8a58f472a3592a77d2b537cd1f36550cfd"} Mar 19 20:14:53 crc kubenswrapper[5033]: I0319 20:14:53.621048 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:14:53 crc kubenswrapper[5033]: E0319 20:14:53.621976 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.188695 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9"] Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.190738 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.201883 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9"] Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.208855 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.209368 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.243391 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7jq\" (UniqueName: \"kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.243463 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.243530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.344886 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7jq\" (UniqueName: \"kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.344935 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.344974 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.346058 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.364794 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.372361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7jq\" (UniqueName: \"kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq\") pod \"collect-profiles-29565855-whqw9\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:00 crc kubenswrapper[5033]: I0319 20:15:00.579593 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:08 crc kubenswrapper[5033]: I0319 20:15:08.620564 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:15:08 crc kubenswrapper[5033]: E0319 20:15:08.621492 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:15:10 crc kubenswrapper[5033]: E0319 20:15:10.308328 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 19 20:15:10 crc kubenswrapper[5033]: E0319 20:15:10.310128 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar 
--ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g9kzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-d7mb4_openshift-must-gather-sdzkv(68b7039b-ceac-43c0-b8ac-40faaf00fc9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:15:10 crc kubenswrapper[5033]: E0319 20:15:10.311969 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" Mar 19 20:15:11 crc kubenswrapper[5033]: I0319 20:15:11.092733 5033 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9"] Mar 19 20:15:11 crc kubenswrapper[5033]: W0319 20:15:11.097585 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eba8606_e7e8_4ba0_beb8_3ce712c7570f.slice/crio-ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5 WatchSource:0}: Error finding container ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5: Status 404 returned error can't find the container with id ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5 Mar 19 20:15:11 crc kubenswrapper[5033]: I0319 20:15:11.301484 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" event={"ID":"6eba8606-e7e8-4ba0-beb8-3ce712c7570f","Type":"ContainerStarted","Data":"ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5"} Mar 19 20:15:11 crc kubenswrapper[5033]: E0319 20:15:11.303213 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" Mar 19 20:15:12 crc kubenswrapper[5033]: I0319 20:15:12.311187 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" event={"ID":"6eba8606-e7e8-4ba0-beb8-3ce712c7570f","Type":"ContainerStarted","Data":"b1e5f86f5d163645aea2b8268eda757a9d0e1bbdad4954863e411f15339dbb91"} Mar 19 20:15:12 crc kubenswrapper[5033]: I0319 20:15:12.330697 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" 
podStartSLOduration=12.330676629 podStartE2EDuration="12.330676629s" podCreationTimestamp="2026-03-19 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:15:12.322944171 +0000 UTC m=+4722.427974020" watchObservedRunningTime="2026-03-19 20:15:12.330676629 +0000 UTC m=+4722.435706478" Mar 19 20:15:13 crc kubenswrapper[5033]: I0319 20:15:13.319792 5033 generic.go:334] "Generic (PLEG): container finished" podID="6eba8606-e7e8-4ba0-beb8-3ce712c7570f" containerID="b1e5f86f5d163645aea2b8268eda757a9d0e1bbdad4954863e411f15339dbb91" exitCode=0 Mar 19 20:15:13 crc kubenswrapper[5033]: I0319 20:15:13.319894 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" event={"ID":"6eba8606-e7e8-4ba0-beb8-3ce712c7570f","Type":"ContainerDied","Data":"b1e5f86f5d163645aea2b8268eda757a9d0e1bbdad4954863e411f15339dbb91"} Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.467646 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.560009 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume\") pod \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.560097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume\") pod \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.560163 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7jq\" (UniqueName: \"kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq\") pod \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\" (UID: \"6eba8606-e7e8-4ba0-beb8-3ce712c7570f\") " Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.567901 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume" (OuterVolumeSpecName: "config-volume") pod "6eba8606-e7e8-4ba0-beb8-3ce712c7570f" (UID: "6eba8606-e7e8-4ba0-beb8-3ce712c7570f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.606120 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq" (OuterVolumeSpecName: "kube-api-access-cl7jq") pod "6eba8606-e7e8-4ba0-beb8-3ce712c7570f" (UID: "6eba8606-e7e8-4ba0-beb8-3ce712c7570f"). 
InnerVolumeSpecName "kube-api-access-cl7jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.635479 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6eba8606-e7e8-4ba0-beb8-3ce712c7570f" (UID: "6eba8606-e7e8-4ba0-beb8-3ce712c7570f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.667026 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.667072 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:15 crc kubenswrapper[5033]: I0319 20:15:15.667083 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7jq\" (UniqueName: \"kubernetes.io/projected/6eba8606-e7e8-4ba0-beb8-3ce712c7570f-kube-api-access-cl7jq\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.360849 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" event={"ID":"6eba8606-e7e8-4ba0-beb8-3ce712c7570f","Type":"ContainerDied","Data":"ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5"} Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.361199 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab4b1d95e4235336b23be62a5d70f880848bfef1f38fef506ae1534ad57c13b5" Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.360890 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-whqw9" Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.546543 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb"] Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.561745 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-ndjfb"] Mar 19 20:15:16 crc kubenswrapper[5033]: I0319 20:15:16.632878 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d8a751-7b1a-4169-bbb2-a052502ec1af" path="/var/lib/kubelet/pods/f9d8a751-7b1a-4169-bbb2-a052502ec1af/volumes" Mar 19 20:15:20 crc kubenswrapper[5033]: I0319 20:15:20.644676 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:15:20 crc kubenswrapper[5033]: E0319 20:15:20.646301 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:15:23 crc kubenswrapper[5033]: I0319 20:15:23.423619 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" event={"ID":"68b7039b-ceac-43c0-b8ac-40faaf00fc9e","Type":"ContainerStarted","Data":"68750624400e974e1173cf9836b600c8b04119cb851a79d046e18dfad7bce94c"} Mar 19 20:15:23 crc kubenswrapper[5033]: I0319 20:15:23.446762 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" podStartSLOduration=1.4033499950000001 
podStartE2EDuration="33.446742571s" podCreationTimestamp="2026-03-19 20:14:50 +0000 UTC" firstStartedPulling="2026-03-19 20:14:50.999929151 +0000 UTC m=+4701.104959000" lastFinishedPulling="2026-03-19 20:15:23.043321737 +0000 UTC m=+4733.148351576" observedRunningTime="2026-03-19 20:15:23.440501135 +0000 UTC m=+4733.545530994" watchObservedRunningTime="2026-03-19 20:15:23.446742571 +0000 UTC m=+4733.551772420" Mar 19 20:15:35 crc kubenswrapper[5033]: I0319 20:15:35.621224 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:15:35 crc kubenswrapper[5033]: E0319 20:15:35.622121 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:15:41 crc kubenswrapper[5033]: I0319 20:15:41.820254 5033 scope.go:117] "RemoveContainer" containerID="d627501252ac9a6fd3b9365d09f4444295a6e8b033282ebebf81550165f17836" Mar 19 20:15:48 crc kubenswrapper[5033]: I0319 20:15:48.621200 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:15:48 crc kubenswrapper[5033]: E0319 20:15:48.622170 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.144764 
5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565856-xxmqn"] Mar 19 20:16:00 crc kubenswrapper[5033]: E0319 20:16:00.145681 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eba8606-e7e8-4ba0-beb8-3ce712c7570f" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.145694 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eba8606-e7e8-4ba0-beb8-3ce712c7570f" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.145936 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eba8606-e7e8-4ba0-beb8-3ce712c7570f" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.147415 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.157218 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-xxmqn"] Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.160627 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.160645 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.160885 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.203301 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4cjb\" (UniqueName: \"kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb\") pod \"auto-csr-approver-29565856-xxmqn\" (UID: \"32c9821a-4188-4e83-87f2-3ce4c0158d02\") " 
pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.305329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4cjb\" (UniqueName: \"kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb\") pod \"auto-csr-approver-29565856-xxmqn\" (UID: \"32c9821a-4188-4e83-87f2-3ce4c0158d02\") " pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:00 crc kubenswrapper[5033]: I0319 20:16:00.882209 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4cjb\" (UniqueName: \"kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb\") pod \"auto-csr-approver-29565856-xxmqn\" (UID: \"32c9821a-4188-4e83-87f2-3ce4c0158d02\") " pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:01 crc kubenswrapper[5033]: I0319 20:16:01.068013 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:02 crc kubenswrapper[5033]: I0319 20:16:02.007071 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-xxmqn"] Mar 19 20:16:02 crc kubenswrapper[5033]: I0319 20:16:02.831124 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" event={"ID":"32c9821a-4188-4e83-87f2-3ce4c0158d02","Type":"ContainerStarted","Data":"5b30fda661d67530dedcce046c13fc18cf9b2cf168dba21b48821b57886096b4"} Mar 19 20:16:03 crc kubenswrapper[5033]: I0319 20:16:03.621020 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:16:03 crc kubenswrapper[5033]: E0319 20:16:03.621722 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:03 crc kubenswrapper[5033]: I0319 20:16:03.840585 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" event={"ID":"32c9821a-4188-4e83-87f2-3ce4c0158d02","Type":"ContainerStarted","Data":"e480109e86113cf34e7797564dd40d30e79d579cf38467bc53ec2c3879f6daa3"} Mar 19 20:16:03 crc kubenswrapper[5033]: I0319 20:16:03.859482 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" podStartSLOduration=2.43332073 podStartE2EDuration="3.859423654s" podCreationTimestamp="2026-03-19 20:16:00 +0000 UTC" firstStartedPulling="2026-03-19 20:16:02.012075763 +0000 UTC m=+4772.117105612" lastFinishedPulling="2026-03-19 20:16:03.438178687 +0000 UTC m=+4773.543208536" observedRunningTime="2026-03-19 20:16:03.852781997 +0000 UTC m=+4773.957811836" watchObservedRunningTime="2026-03-19 20:16:03.859423654 +0000 UTC m=+4773.964453503" Mar 19 20:16:05 crc kubenswrapper[5033]: I0319 20:16:05.860101 5033 generic.go:334] "Generic (PLEG): container finished" podID="32c9821a-4188-4e83-87f2-3ce4c0158d02" containerID="e480109e86113cf34e7797564dd40d30e79d579cf38467bc53ec2c3879f6daa3" exitCode=0 Mar 19 20:16:05 crc kubenswrapper[5033]: I0319 20:16:05.860290 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" event={"ID":"32c9821a-4188-4e83-87f2-3ce4c0158d02","Type":"ContainerDied","Data":"e480109e86113cf34e7797564dd40d30e79d579cf38467bc53ec2c3879f6daa3"} Mar 19 20:16:07 crc kubenswrapper[5033]: I0319 20:16:07.995501 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.172800 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4cjb\" (UniqueName: \"kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb\") pod \"32c9821a-4188-4e83-87f2-3ce4c0158d02\" (UID: \"32c9821a-4188-4e83-87f2-3ce4c0158d02\") " Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.179433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb" (OuterVolumeSpecName: "kube-api-access-f4cjb") pod "32c9821a-4188-4e83-87f2-3ce4c0158d02" (UID: "32c9821a-4188-4e83-87f2-3ce4c0158d02"). InnerVolumeSpecName "kube-api-access-f4cjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.275652 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4cjb\" (UniqueName: \"kubernetes.io/projected/32c9821a-4188-4e83-87f2-3ce4c0158d02-kube-api-access-f4cjb\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.427698 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:08 crc kubenswrapper[5033]: E0319 20:16:08.428202 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c9821a-4188-4e83-87f2-3ce4c0158d02" containerName="oc" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.428218 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c9821a-4188-4e83-87f2-3ce4c0158d02" containerName="oc" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.428558 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c9821a-4188-4e83-87f2-3ce4c0158d02" containerName="oc" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.430151 5033 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.440758 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.581438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.581809 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.582384 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4t5x\" (UniqueName: \"kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.684319 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4t5x\" (UniqueName: \"kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 
20:16:08.684422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.684555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.684975 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.684989 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.702316 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4t5x\" (UniqueName: \"kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x\") pod \"redhat-marketplace-txlbt\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.747757 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.910203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" event={"ID":"32c9821a-4188-4e83-87f2-3ce4c0158d02","Type":"ContainerDied","Data":"5b30fda661d67530dedcce046c13fc18cf9b2cf168dba21b48821b57886096b4"} Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.910492 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b30fda661d67530dedcce046c13fc18cf9b2cf168dba21b48821b57886096b4" Mar 19 20:16:08 crc kubenswrapper[5033]: I0319 20:16:08.910545 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-xxmqn" Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.076348 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-wdtdf"] Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.085232 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-wdtdf"] Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.504383 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.927555 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerID="9ce6895889a7fe8536cf9304e3581db4c9d65b9c690516a787fc245c3a1753d6" exitCode=0 Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.927757 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerDied","Data":"9ce6895889a7fe8536cf9304e3581db4c9d65b9c690516a787fc245c3a1753d6"} Mar 19 20:16:09 crc kubenswrapper[5033]: I0319 20:16:09.927884 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerStarted","Data":"45b70269e107f6983f9640533d0d62da8c89d59c3506d548bd7ac97b54fde730"} Mar 19 20:16:10 crc kubenswrapper[5033]: I0319 20:16:10.637778 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99" path="/var/lib/kubelet/pods/7aaa0936-96bd-4f0c-b7eb-9abbd5ce2f99/volumes" Mar 19 20:16:11 crc kubenswrapper[5033]: I0319 20:16:11.951884 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerStarted","Data":"5853d829152c8e1616f9443015e715b2e7cc98ca15f016aaf11b728c6954efda"} Mar 19 20:16:12 crc kubenswrapper[5033]: E0319 20:16:12.516569 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1309968_ea75_49a8_977f_3e8a7acd6868.slice/crio-5853d829152c8e1616f9443015e715b2e7cc98ca15f016aaf11b728c6954efda.scope\": RecentStats: unable to find data in memory cache]" Mar 19 20:16:12 crc kubenswrapper[5033]: I0319 20:16:12.960756 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerID="5853d829152c8e1616f9443015e715b2e7cc98ca15f016aaf11b728c6954efda" exitCode=0 Mar 19 20:16:12 crc kubenswrapper[5033]: I0319 20:16:12.960846 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerDied","Data":"5853d829152c8e1616f9443015e715b2e7cc98ca15f016aaf11b728c6954efda"} Mar 19 20:16:13 crc kubenswrapper[5033]: I0319 20:16:13.973008 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" 
event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerStarted","Data":"7a7cae33bb94dabaec05872b6aac9cb961e25e6af6b09b323694718574f94f48"} Mar 19 20:16:13 crc kubenswrapper[5033]: I0319 20:16:13.993644 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txlbt" podStartSLOduration=2.59533002 podStartE2EDuration="5.993628439s" podCreationTimestamp="2026-03-19 20:16:08 +0000 UTC" firstStartedPulling="2026-03-19 20:16:09.929801669 +0000 UTC m=+4780.034831518" lastFinishedPulling="2026-03-19 20:16:13.328100088 +0000 UTC m=+4783.433129937" observedRunningTime="2026-03-19 20:16:13.991324614 +0000 UTC m=+4784.096354473" watchObservedRunningTime="2026-03-19 20:16:13.993628439 +0000 UTC m=+4784.098658278" Mar 19 20:16:14 crc kubenswrapper[5033]: I0319 20:16:14.621332 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:16:14 crc kubenswrapper[5033]: E0319 20:16:14.621869 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:18 crc kubenswrapper[5033]: I0319 20:16:18.760547 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:18 crc kubenswrapper[5033]: I0319 20:16:18.761098 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:19 crc kubenswrapper[5033]: I0319 20:16:19.810071 5033 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-txlbt" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:19 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:19 crc kubenswrapper[5033]: > Mar 19 20:16:27 crc kubenswrapper[5033]: I0319 20:16:27.621026 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:16:27 crc kubenswrapper[5033]: E0319 20:16:27.621818 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:28 crc kubenswrapper[5033]: I0319 20:16:28.815708 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:28 crc kubenswrapper[5033]: I0319 20:16:28.866461 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:29 crc kubenswrapper[5033]: I0319 20:16:29.055796 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:29 crc kubenswrapper[5033]: I0319 20:16:29.100057 5033 generic.go:334] "Generic (PLEG): container finished" podID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" containerID="68750624400e974e1173cf9836b600c8b04119cb851a79d046e18dfad7bce94c" exitCode=0 Mar 19 20:16:29 crc kubenswrapper[5033]: I0319 20:16:29.100913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" 
event={"ID":"68b7039b-ceac-43c0-b8ac-40faaf00fc9e","Type":"ContainerDied","Data":"68750624400e974e1173cf9836b600c8b04119cb851a79d046e18dfad7bce94c"} Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.115028 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txlbt" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="registry-server" containerID="cri-o://7a7cae33bb94dabaec05872b6aac9cb961e25e6af6b09b323694718574f94f48" gracePeriod=2 Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.307311 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.473784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9kzq\" (UniqueName: \"kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq\") pod \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.474158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host\") pod \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\" (UID: \"68b7039b-ceac-43c0-b8ac-40faaf00fc9e\") " Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.475574 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host" (OuterVolumeSpecName: "host") pod "68b7039b-ceac-43c0-b8ac-40faaf00fc9e" (UID: "68b7039b-ceac-43c0-b8ac-40faaf00fc9e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.475940 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.493594 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq" (OuterVolumeSpecName: "kube-api-access-g9kzq") pod "68b7039b-ceac-43c0-b8ac-40faaf00fc9e" (UID: "68b7039b-ceac-43c0-b8ac-40faaf00fc9e"). InnerVolumeSpecName "kube-api-access-g9kzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.512821 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-d7mb4"] Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.531328 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-d7mb4"] Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.577962 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9kzq\" (UniqueName: \"kubernetes.io/projected/68b7039b-ceac-43c0-b8ac-40faaf00fc9e-kube-api-access-g9kzq\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:30 crc kubenswrapper[5033]: I0319 20:16:30.642889 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" path="/var/lib/kubelet/pods/68b7039b-ceac-43c0-b8ac-40faaf00fc9e/volumes" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.126227 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerID="7a7cae33bb94dabaec05872b6aac9cb961e25e6af6b09b323694718574f94f48" exitCode=0 Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.126276 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerDied","Data":"7a7cae33bb94dabaec05872b6aac9cb961e25e6af6b09b323694718574f94f48"} Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.128557 5033 scope.go:117] "RemoveContainer" containerID="68750624400e974e1173cf9836b600c8b04119cb851a79d046e18dfad7bce94c" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.128630 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-d7mb4" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.524738 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.606261 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities\") pod \"b1309968-ea75-49a8-977f-3e8a7acd6868\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.606535 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content\") pod \"b1309968-ea75-49a8-977f-3e8a7acd6868\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.606575 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4t5x\" (UniqueName: \"kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x\") pod \"b1309968-ea75-49a8-977f-3e8a7acd6868\" (UID: \"b1309968-ea75-49a8-977f-3e8a7acd6868\") " Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.607900 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities" (OuterVolumeSpecName: "utilities") pod "b1309968-ea75-49a8-977f-3e8a7acd6868" (UID: "b1309968-ea75-49a8-977f-3e8a7acd6868"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.636668 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x" (OuterVolumeSpecName: "kube-api-access-j4t5x") pod "b1309968-ea75-49a8-977f-3e8a7acd6868" (UID: "b1309968-ea75-49a8-977f-3e8a7acd6868"). InnerVolumeSpecName "kube-api-access-j4t5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.650814 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1309968-ea75-49a8-977f-3e8a7acd6868" (UID: "b1309968-ea75-49a8-977f-3e8a7acd6868"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.709242 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.709608 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4t5x\" (UniqueName: \"kubernetes.io/projected/b1309968-ea75-49a8-977f-3e8a7acd6868-kube-api-access-j4t5x\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.709620 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1309968-ea75-49a8-977f-3e8a7acd6868-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753237 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-h4qsv"] Mar 19 20:16:31 crc kubenswrapper[5033]: E0319 20:16:31.753661 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="registry-server" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753678 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="registry-server" Mar 19 20:16:31 crc kubenswrapper[5033]: E0319 20:16:31.753690 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" containerName="container-00" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753696 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" containerName="container-00" Mar 19 20:16:31 crc kubenswrapper[5033]: E0319 20:16:31.753715 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" 
containerName="extract-content" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753722 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="extract-content" Mar 19 20:16:31 crc kubenswrapper[5033]: E0319 20:16:31.753733 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="extract-utilities" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753740 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="extract-utilities" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753942 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b7039b-ceac-43c0-b8ac-40faaf00fc9e" containerName="container-00" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.753969 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" containerName="registry-server" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.754666 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.811652 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknzb\" (UniqueName: \"kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.811787 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.913226 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.913338 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknzb\" (UniqueName: \"kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc kubenswrapper[5033]: I0319 20:16:31.913399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:31 crc 
kubenswrapper[5033]: I0319 20:16:31.931801 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknzb\" (UniqueName: \"kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb\") pod \"crc-debug-h4qsv\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.068025 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.146123 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" event={"ID":"de3c5159-df2d-49b6-b49b-c4a9242c9c40","Type":"ContainerStarted","Data":"8935d334c94cf4f1695cdbe1293de93f06a63a20a98dd3a8cd1a9636b84c4b81"} Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.148395 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txlbt" event={"ID":"b1309968-ea75-49a8-977f-3e8a7acd6868","Type":"ContainerDied","Data":"45b70269e107f6983f9640533d0d62da8c89d59c3506d548bd7ac97b54fde730"} Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.148435 5033 scope.go:117] "RemoveContainer" containerID="7a7cae33bb94dabaec05872b6aac9cb961e25e6af6b09b323694718574f94f48" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.148561 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txlbt" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.248839 5033 scope.go:117] "RemoveContainer" containerID="5853d829152c8e1616f9443015e715b2e7cc98ca15f016aaf11b728c6954efda" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.270477 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.289416 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txlbt"] Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.309101 5033 scope.go:117] "RemoveContainer" containerID="9ce6895889a7fe8536cf9304e3581db4c9d65b9c690516a787fc245c3a1753d6" Mar 19 20:16:32 crc kubenswrapper[5033]: I0319 20:16:32.633771 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1309968-ea75-49a8-977f-3e8a7acd6868" path="/var/lib/kubelet/pods/b1309968-ea75-49a8-977f-3e8a7acd6868/volumes" Mar 19 20:16:33 crc kubenswrapper[5033]: E0319 20:16:33.079253 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:16:33 crc kubenswrapper[5033]: I0319 20:16:33.176495 5033 generic.go:334] "Generic (PLEG): container finished" podID="de3c5159-df2d-49b6-b49b-c4a9242c9c40" containerID="5a789f844f4a4226f3869d831a4caa4b00d45c0bca6037689dfb4fc72377ef56" exitCode=0 Mar 19 20:16:33 crc kubenswrapper[5033]: I0319 20:16:33.177037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" event={"ID":"de3c5159-df2d-49b6-b49b-c4a9242c9c40","Type":"ContainerDied","Data":"5a789f844f4a4226f3869d831a4caa4b00d45c0bca6037689dfb4fc72377ef56"} Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 
20:16:34.305400 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.363567 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host\") pod \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.363745 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host" (OuterVolumeSpecName: "host") pod "de3c5159-df2d-49b6-b49b-c4a9242c9c40" (UID: "de3c5159-df2d-49b6-b49b-c4a9242c9c40"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.363772 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknzb\" (UniqueName: \"kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb\") pod \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\" (UID: \"de3c5159-df2d-49b6-b49b-c4a9242c9c40\") " Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.364359 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3c5159-df2d-49b6-b49b-c4a9242c9c40-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.377523 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb" (OuterVolumeSpecName: "kube-api-access-hknzb") pod "de3c5159-df2d-49b6-b49b-c4a9242c9c40" (UID: "de3c5159-df2d-49b6-b49b-c4a9242c9c40"). InnerVolumeSpecName "kube-api-access-hknzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.410936 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-h4qsv"] Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.433734 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-h4qsv"] Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.465426 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:34 crc kubenswrapper[5033]: E0319 20:16:34.465835 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3c5159-df2d-49b6-b49b-c4a9242c9c40" containerName="container-00" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.465850 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3c5159-df2d-49b6-b49b-c4a9242c9c40" containerName="container-00" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.466075 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3c5159-df2d-49b6-b49b-c4a9242c9c40" containerName="container-00" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.466604 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hknzb\" (UniqueName: \"kubernetes.io/projected/de3c5159-df2d-49b6-b49b-c4a9242c9c40-kube-api-access-hknzb\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.467591 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.492161 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.569089 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.569169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.569206 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpz7\" (UniqueName: \"kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.635609 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3c5159-df2d-49b6-b49b-c4a9242c9c40" path="/var/lib/kubelet/pods/de3c5159-df2d-49b6-b49b-c4a9242c9c40/volumes" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.671952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.672020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.672044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpz7\" (UniqueName: \"kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.672523 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.672796 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.712345 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpz7\" (UniqueName: 
\"kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7\") pod \"certified-operators-4xwnr\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:34 crc kubenswrapper[5033]: I0319 20:16:34.790032 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.196836 5033 scope.go:117] "RemoveContainer" containerID="5a789f844f4a4226f3869d831a4caa4b00d45c0bca6037689dfb4fc72377ef56" Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.196870 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-h4qsv" Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.586494 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:35 crc kubenswrapper[5033]: W0319 20:16:35.592539 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdb042c_2486_439b_b5a7_499780683fed.slice/crio-5d26d10acdcc432f7902ecd083c02a5ccbf78fce25e6df3afd3d4718ce0241ba WatchSource:0}: Error finding container 5d26d10acdcc432f7902ecd083c02a5ccbf78fce25e6df3afd3d4718ce0241ba: Status 404 returned error can't find the container with id 5d26d10acdcc432f7902ecd083c02a5ccbf78fce25e6df3afd3d4718ce0241ba Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.871781 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-gkt9q"] Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.873118 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.904147 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvm6\" (UniqueName: \"kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:35 crc kubenswrapper[5033]: I0319 20:16:35.904232 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.006015 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvm6\" (UniqueName: \"kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.006097 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.006426 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc 
kubenswrapper[5033]: I0319 20:16:36.039000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvm6\" (UniqueName: \"kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6\") pod \"crc-debug-gkt9q\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.187923 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.253950 5033 generic.go:334] "Generic (PLEG): container finished" podID="8bdb042c-2486-439b-b5a7-499780683fed" containerID="26cd92e35c4219e6d6955ec107d9f5d6379d43a95de28add7fef1e591552a8a9" exitCode=0 Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.254603 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerDied","Data":"26cd92e35c4219e6d6955ec107d9f5d6379d43a95de28add7fef1e591552a8a9"} Mar 19 20:16:36 crc kubenswrapper[5033]: I0319 20:16:36.254682 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerStarted","Data":"5d26d10acdcc432f7902ecd083c02a5ccbf78fce25e6df3afd3d4718ce0241ba"} Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.279051 5033 generic.go:334] "Generic (PLEG): container finished" podID="b40af062-a898-4af4-91a3-2b5d09198608" containerID="d26a70600b84f2ed31389ebbd718310876c685bc47465568ac5f9b9fe37deb1e" exitCode=0 Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.279153 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" 
event={"ID":"b40af062-a898-4af4-91a3-2b5d09198608","Type":"ContainerDied","Data":"d26a70600b84f2ed31389ebbd718310876c685bc47465568ac5f9b9fe37deb1e"} Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.279601 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" event={"ID":"b40af062-a898-4af4-91a3-2b5d09198608","Type":"ContainerStarted","Data":"efe16cf44f6f00c2d0074bee64506dc4cb9bebd3e2ef60b7eba9028a61709441"} Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.281666 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerStarted","Data":"ac750611ccb63abc6c9da306057b8ada974003f7b82aecca6a92c7059a93ff33"} Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.341060 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-gkt9q"] Mar 19 20:16:37 crc kubenswrapper[5033]: I0319 20:16:37.354220 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sdzkv/crc-debug-gkt9q"] Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.403052 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.507082 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsvm6\" (UniqueName: \"kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6\") pod \"b40af062-a898-4af4-91a3-2b5d09198608\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.507239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host\") pod \"b40af062-a898-4af4-91a3-2b5d09198608\" (UID: \"b40af062-a898-4af4-91a3-2b5d09198608\") " Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.507371 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host" (OuterVolumeSpecName: "host") pod "b40af062-a898-4af4-91a3-2b5d09198608" (UID: "b40af062-a898-4af4-91a3-2b5d09198608"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.507797 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b40af062-a898-4af4-91a3-2b5d09198608-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.523176 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6" (OuterVolumeSpecName: "kube-api-access-vsvm6") pod "b40af062-a898-4af4-91a3-2b5d09198608" (UID: "b40af062-a898-4af4-91a3-2b5d09198608"). InnerVolumeSpecName "kube-api-access-vsvm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.609170 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsvm6\" (UniqueName: \"kubernetes.io/projected/b40af062-a898-4af4-91a3-2b5d09198608-kube-api-access-vsvm6\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.621055 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:16:38 crc kubenswrapper[5033]: E0319 20:16:38.621338 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:38 crc kubenswrapper[5033]: I0319 20:16:38.632211 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40af062-a898-4af4-91a3-2b5d09198608" path="/var/lib/kubelet/pods/b40af062-a898-4af4-91a3-2b5d09198608/volumes" Mar 19 20:16:39 crc kubenswrapper[5033]: I0319 20:16:39.298809 5033 scope.go:117] "RemoveContainer" containerID="d26a70600b84f2ed31389ebbd718310876c685bc47465568ac5f9b9fe37deb1e" Mar 19 20:16:39 crc kubenswrapper[5033]: I0319 20:16:39.298827 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sdzkv/crc-debug-gkt9q" Mar 19 20:16:39 crc kubenswrapper[5033]: I0319 20:16:39.301282 5033 generic.go:334] "Generic (PLEG): container finished" podID="8bdb042c-2486-439b-b5a7-499780683fed" containerID="ac750611ccb63abc6c9da306057b8ada974003f7b82aecca6a92c7059a93ff33" exitCode=0 Mar 19 20:16:39 crc kubenswrapper[5033]: I0319 20:16:39.301322 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerDied","Data":"ac750611ccb63abc6c9da306057b8ada974003f7b82aecca6a92c7059a93ff33"} Mar 19 20:16:40 crc kubenswrapper[5033]: I0319 20:16:40.329709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerStarted","Data":"553194e2fdf4ed26886991f9ab224879febd3ef7793b6d085f427c9c926949f3"} Mar 19 20:16:40 crc kubenswrapper[5033]: I0319 20:16:40.352546 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xwnr" podStartSLOduration=2.814204079 podStartE2EDuration="6.35252778s" podCreationTimestamp="2026-03-19 20:16:34 +0000 UTC" firstStartedPulling="2026-03-19 20:16:36.269415825 +0000 UTC m=+4806.374445674" lastFinishedPulling="2026-03-19 20:16:39.807739536 +0000 UTC m=+4809.912769375" observedRunningTime="2026-03-19 20:16:40.349191356 +0000 UTC m=+4810.454221205" watchObservedRunningTime="2026-03-19 20:16:40.35252778 +0000 UTC m=+4810.457557629" Mar 19 20:16:42 crc kubenswrapper[5033]: I0319 20:16:42.116711 5033 scope.go:117] "RemoveContainer" containerID="c1e14358f99c9d7203c4566b26e6758d7c77501352f7cec22f032280bf649424" Mar 19 20:16:43 crc kubenswrapper[5033]: E0319 20:16:43.465304 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:16:44 crc kubenswrapper[5033]: I0319 20:16:44.792712 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:44 crc kubenswrapper[5033]: I0319 20:16:44.802137 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:44 crc kubenswrapper[5033]: I0319 20:16:44.853794 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:45 crc kubenswrapper[5033]: I0319 20:16:45.426199 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:45 crc kubenswrapper[5033]: I0319 20:16:45.492747 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:47 crc kubenswrapper[5033]: I0319 20:16:47.388248 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xwnr" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="registry-server" containerID="cri-o://553194e2fdf4ed26886991f9ab224879febd3ef7793b6d085f427c9c926949f3" gracePeriod=2 Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.410961 5033 generic.go:334] "Generic (PLEG): container finished" podID="8bdb042c-2486-439b-b5a7-499780683fed" containerID="553194e2fdf4ed26886991f9ab224879febd3ef7793b6d085f427c9c926949f3" exitCode=0 Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.411024 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" 
event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerDied","Data":"553194e2fdf4ed26886991f9ab224879febd3ef7793b6d085f427c9c926949f3"} Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.676274 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.834088 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content\") pod \"8bdb042c-2486-439b-b5a7-499780683fed\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.834422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities\") pod \"8bdb042c-2486-439b-b5a7-499780683fed\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.834526 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rpz7\" (UniqueName: \"kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7\") pod \"8bdb042c-2486-439b-b5a7-499780683fed\" (UID: \"8bdb042c-2486-439b-b5a7-499780683fed\") " Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.835020 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities" (OuterVolumeSpecName: "utilities") pod "8bdb042c-2486-439b-b5a7-499780683fed" (UID: "8bdb042c-2486-439b-b5a7-499780683fed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.835320 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.847709 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7" (OuterVolumeSpecName: "kube-api-access-7rpz7") pod "8bdb042c-2486-439b-b5a7-499780683fed" (UID: "8bdb042c-2486-439b-b5a7-499780683fed"). InnerVolumeSpecName "kube-api-access-7rpz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.896874 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bdb042c-2486-439b-b5a7-499780683fed" (UID: "8bdb042c-2486-439b-b5a7-499780683fed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.937409 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdb042c-2486-439b-b5a7-499780683fed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:48 crc kubenswrapper[5033]: I0319 20:16:48.937439 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rpz7\" (UniqueName: \"kubernetes.io/projected/8bdb042c-2486-439b-b5a7-499780683fed-kube-api-access-7rpz7\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.419781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xwnr" event={"ID":"8bdb042c-2486-439b-b5a7-499780683fed","Type":"ContainerDied","Data":"5d26d10acdcc432f7902ecd083c02a5ccbf78fce25e6df3afd3d4718ce0241ba"} Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.419830 5033 scope.go:117] "RemoveContainer" containerID="553194e2fdf4ed26886991f9ab224879febd3ef7793b6d085f427c9c926949f3" Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.419852 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xwnr" Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.443598 5033 scope.go:117] "RemoveContainer" containerID="ac750611ccb63abc6c9da306057b8ada974003f7b82aecca6a92c7059a93ff33" Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.476432 5033 scope.go:117] "RemoveContainer" containerID="26cd92e35c4219e6d6955ec107d9f5d6379d43a95de28add7fef1e591552a8a9" Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.485368 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:49 crc kubenswrapper[5033]: I0319 20:16:49.507091 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xwnr"] Mar 19 20:16:50 crc kubenswrapper[5033]: I0319 20:16:50.639110 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdb042c-2486-439b-b5a7-499780683fed" path="/var/lib/kubelet/pods/8bdb042c-2486-439b-b5a7-499780683fed/volumes" Mar 19 20:16:53 crc kubenswrapper[5033]: I0319 20:16:53.622274 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:16:53 crc kubenswrapper[5033]: E0319 20:16:53.622992 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:16:53 crc kubenswrapper[5033]: E0319 20:16:53.736285 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": 
RecentStats: unable to find data in memory cache]" Mar 19 20:17:03 crc kubenswrapper[5033]: E0319 20:17:03.981215 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:17:07 crc kubenswrapper[5033]: I0319 20:17:07.621078 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:17:07 crc kubenswrapper[5033]: E0319 20:17:07.621830 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:17:14 crc kubenswrapper[5033]: E0319 20:17:14.230596 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:17:22 crc kubenswrapper[5033]: I0319 20:17:22.621064 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:17:22 crc kubenswrapper[5033]: E0319 20:17:22.621791 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:17:24 crc kubenswrapper[5033]: E0319 20:17:24.484087 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b7039b_ceac_43c0_b8ac_40faaf00fc9e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:17:37 crc kubenswrapper[5033]: I0319 20:17:37.621328 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:17:37 crc kubenswrapper[5033]: E0319 20:17:37.621951 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:17:45 crc kubenswrapper[5033]: I0319 20:17:45.260554 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_1b5fab5b-14ba-4b0a-adb3-f4bad7edac99/init-config-reloader/0.log" Mar 19 20:17:45 crc kubenswrapper[5033]: I0319 20:17:45.803818 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_1b5fab5b-14ba-4b0a-adb3-f4bad7edac99/init-config-reloader/0.log" Mar 19 20:17:45 crc kubenswrapper[5033]: I0319 20:17:45.810467 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_1b5fab5b-14ba-4b0a-adb3-f4bad7edac99/config-reloader/0.log" Mar 19 20:17:45 crc kubenswrapper[5033]: I0319 20:17:45.847367 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_1b5fab5b-14ba-4b0a-adb3-f4bad7edac99/alertmanager/0.log" Mar 19 20:17:46 crc kubenswrapper[5033]: I0319 20:17:46.179663 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc86df956-fh6cw_646984ab-e574-4cab-8933-8c5ba324f84c/barbican-api-log/0.log" Mar 19 20:17:46 crc kubenswrapper[5033]: I0319 20:17:46.237947 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cc86df956-fh6cw_646984ab-e574-4cab-8933-8c5ba324f84c/barbican-api/0.log" Mar 19 20:17:46 crc kubenswrapper[5033]: I0319 20:17:46.844682 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d4b6bf66b-qkrnm_cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8/barbican-keystone-listener/0.log" Mar 19 20:17:46 crc kubenswrapper[5033]: I0319 20:17:46.927825 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d4b6bf66b-qkrnm_cebc126d-43aa-4c9c-9a38-6ce2b4ee18e8/barbican-keystone-listener-log/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.137070 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7764867bbc-cjkpd_dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b/barbican-worker-log/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.149814 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7764867bbc-cjkpd_dee8a19b-9182-4cce-b7b5-5aad4cf5ff8b/barbican-worker/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.338262 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-snswf_78f1dc7a-f38b-4b03-ad2f-3cc0fd7d7803/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.678326 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2084df77-794b-44e4-92a5-16ccb442b1ee/ceilometer-notification-agent/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.776332 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2084df77-794b-44e4-92a5-16ccb442b1ee/ceilometer-central-agent/0.log" Mar 19 20:17:47 crc kubenswrapper[5033]: I0319 20:17:47.954257 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2084df77-794b-44e4-92a5-16ccb442b1ee/proxy-httpd/0.log" Mar 19 20:17:48 crc kubenswrapper[5033]: I0319 20:17:48.110516 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2084df77-794b-44e4-92a5-16ccb442b1ee/sg-core/0.log" Mar 19 20:17:48 crc kubenswrapper[5033]: I0319 20:17:48.486899 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_83e7c9da-f095-4e81-8284-feccedc40ce4/cinder-api/0.log" Mar 19 20:17:48 crc kubenswrapper[5033]: I0319 20:17:48.632966 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_83e7c9da-f095-4e81-8284-feccedc40ce4/cinder-api-log/0.log" Mar 19 20:17:49 crc kubenswrapper[5033]: I0319 20:17:49.025242 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85d96370-ae45-4b23-b625-71562229c174/cinder-scheduler/0.log" Mar 19 20:17:49 crc kubenswrapper[5033]: I0319 20:17:49.123059 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85d96370-ae45-4b23-b625-71562229c174/probe/0.log" Mar 19 20:17:49 crc kubenswrapper[5033]: I0319 20:17:49.452289 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0/cloudkitty-api/0.log" Mar 19 20:17:49 crc kubenswrapper[5033]: I0319 20:17:49.578530 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_fc506eb7-d9c5-4db4-9707-53ff8923ef3b/loki-compactor/0.log" Mar 19 20:17:49 crc kubenswrapper[5033]: I0319 20:17:49.588479 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_2c4f56d2-60d1-4e1c-87a4-a1efd92c37f0/cloudkitty-api-log/0.log" Mar 19 20:17:50 crc kubenswrapper[5033]: I0319 20:17:50.038934 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-5d547bbd4d-n5sh5_97b0c498-1aed-420e-922d-9d04f4ac6c63/loki-distributor/0.log" Mar 19 20:17:50 crc kubenswrapper[5033]: I0319 20:17:50.283389 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-k4z5t_4b6b27fd-56c6-4473-be2b-eb469e816a08/gateway/0.log" Mar 19 20:17:50 crc kubenswrapper[5033]: I0319 20:17:50.485591 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-tjjl9_74fb1224-a73c-47ca-ac3b-d23ed2116a84/gateway/0.log" Mar 19 20:17:50 crc kubenswrapper[5033]: I0319 20:17:50.930880 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_7574570b-6325-4897-a35c-5712967a74f3/loki-index-gateway/0.log" Mar 19 20:17:51 crc kubenswrapper[5033]: I0319 20:17:51.137307 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_31733aba-46c2-4129-9088-e294daafa285/loki-ingester/0.log" Mar 19 20:17:51 crc kubenswrapper[5033]: I0319 20:17:51.620385 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:17:51 crc kubenswrapper[5033]: E0319 20:17:51.620843 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:17:51 crc kubenswrapper[5033]: I0319 20:17:51.709976 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-6f54889599-2pgf8_1da94e31-ccb7-43e3-a22c-36d9d9a35933/loki-query-frontend/0.log" Mar 19 20:17:52 crc kubenswrapper[5033]: I0319 20:17:52.614647 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2f2pv_7e310e51-d5e9-4a0d-9ac7-246d34af93b5/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:52 crc kubenswrapper[5033]: I0319 20:17:52.633722 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-668f98fdd7-t65p9_58e55e58-fd66-4bac-9461-895c0f713861/loki-querier/0.log" Mar 19 20:17:52 crc kubenswrapper[5033]: I0319 20:17:52.851291 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d9vkx_8f4b71de-a42d-4793-a035-a4728876706e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:53 crc kubenswrapper[5033]: I0319 20:17:53.029664 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-gvrd7_cdc791a7-7319-491f-8a1a-bdcd8c333890/init/0.log" Mar 19 20:17:53 crc kubenswrapper[5033]: I0319 20:17:53.234961 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-gvrd7_cdc791a7-7319-491f-8a1a-bdcd8c333890/init/0.log" Mar 19 20:17:53 crc kubenswrapper[5033]: I0319 20:17:53.561462 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-gvrd7_cdc791a7-7319-491f-8a1a-bdcd8c333890/dnsmasq-dns/0.log" Mar 19 20:17:53 crc kubenswrapper[5033]: I0319 
20:17:53.640045 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mfddt_02b8790f-5100-462c-972a-ab03fa3e53fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:53 crc kubenswrapper[5033]: I0319 20:17:53.864408 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_94a07e0d-e86b-4f00-9214-b99ff1484630/glance-httpd/0.log" Mar 19 20:17:54 crc kubenswrapper[5033]: I0319 20:17:54.192769 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d033188c-9f49-46fb-8650-b579f9b4a6ea/glance-log/0.log" Mar 19 20:17:54 crc kubenswrapper[5033]: I0319 20:17:54.194400 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_94a07e0d-e86b-4f00-9214-b99ff1484630/glance-log/0.log" Mar 19 20:17:54 crc kubenswrapper[5033]: I0319 20:17:54.195786 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d033188c-9f49-46fb-8650-b579f9b4a6ea/glance-httpd/0.log" Mar 19 20:17:54 crc kubenswrapper[5033]: I0319 20:17:54.670702 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-m7dnn_bc8058af-3032-408e-b907-0eeed9d07109/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:54 crc kubenswrapper[5033]: I0319 20:17:54.961576 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bf66b_047e163c-b9d5-4426-9810-3cdf70a856a4/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:55 crc kubenswrapper[5033]: I0319 20:17:55.186986 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_a1b0222c-e676-4f3c-8930-6926d7824866/cloudkitty-proc/0.log" Mar 19 20:17:55 crc kubenswrapper[5033]: I0319 20:17:55.562047 5033 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565841-b6rqs_0e5cc708-ac76-4f3a-baed-1d2791ce807a/keystone-cron/0.log" Mar 19 20:17:55 crc kubenswrapper[5033]: I0319 20:17:55.589565 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d44c58694-mkj7x_096a6f51-befa-462f-b029-3d4d84e884bf/keystone-api/0.log" Mar 19 20:17:55 crc kubenswrapper[5033]: I0319 20:17:55.737741 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_855aaf56-adc4-45b2-a632-afc1aeb26f79/kube-state-metrics/0.log" Mar 19 20:17:56 crc kubenswrapper[5033]: I0319 20:17:56.394166 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65f99858cc-j489l_00c78e80-6687-41e0-9b1c-57b76358e01f/neutron-api/0.log" Mar 19 20:17:56 crc kubenswrapper[5033]: I0319 20:17:56.555512 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-685cj_197f45b2-0d11-4b18-ac55-d4fb3b29c09e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:56 crc kubenswrapper[5033]: I0319 20:17:56.696762 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65f99858cc-j489l_00c78e80-6687-41e0-9b1c-57b76358e01f/neutron-httpd/0.log" Mar 19 20:17:56 crc kubenswrapper[5033]: I0319 20:17:56.844548 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qmds5_0b7506fc-a863-4730-be08-587061188731/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:57 crc kubenswrapper[5033]: I0319 20:17:57.698252 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9da9faaa-b095-42c9-90d9-d6b0dac1b3c9/nova-cell0-conductor-conductor/0.log" Mar 19 20:17:57 crc kubenswrapper[5033]: I0319 20:17:57.769237 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_49ced7d3-72c5-4255-a657-9f14d7f2b656/nova-api-log/0.log" Mar 19 20:17:57 crc kubenswrapper[5033]: I0319 20:17:57.959178 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_49ced7d3-72c5-4255-a657-9f14d7f2b656/nova-api-api/0.log" Mar 19 20:17:58 crc kubenswrapper[5033]: I0319 20:17:58.219245 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5d16067a-8ac2-4949-931e-e874147e40dc/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 20:17:58 crc kubenswrapper[5033]: I0319 20:17:58.253936 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ec25f362-50d7-4ebe-951c-c51868e2485d/nova-cell1-conductor-conductor/0.log" Mar 19 20:17:59 crc kubenswrapper[5033]: I0319 20:17:59.174344 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_24d06bd8-50d5-4da0-9041-41f401c1c4fd/nova-metadata-log/0.log" Mar 19 20:17:59 crc kubenswrapper[5033]: I0319 20:17:59.450459 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-tplwr_44718e74-5ed8-41f2-880d-2b72e10d8cb4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:17:59 crc kubenswrapper[5033]: I0319 20:17:59.653008 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_be847d21-69a1-4299-a333-94d99e6af513/nova-scheduler-scheduler/0.log" Mar 19 20:17:59 crc kubenswrapper[5033]: I0319 20:17:59.756699 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_24d06bd8-50d5-4da0-9041-41f401c1c4fd/nova-metadata-metadata/0.log" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.002415 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85185bfa-1205-4129-8f90-55b580fd3939/mysql-bootstrap/0.log" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.146983 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565858-9cbbx"] Mar 19 20:18:00 crc kubenswrapper[5033]: E0319 20:18:00.147376 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="registry-server" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147391 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="registry-server" Mar 19 20:18:00 crc kubenswrapper[5033]: E0319 20:18:00.147412 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="extract-utilities" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147419 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="extract-utilities" Mar 19 20:18:00 crc kubenswrapper[5033]: E0319 20:18:00.147426 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40af062-a898-4af4-91a3-2b5d09198608" containerName="container-00" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147432 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40af062-a898-4af4-91a3-2b5d09198608" containerName="container-00" Mar 19 20:18:00 crc kubenswrapper[5033]: E0319 20:18:00.147467 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="extract-content" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147473 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="extract-content" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147658 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdb042c-2486-439b-b5a7-499780683fed" containerName="registry-server" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.147674 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b40af062-a898-4af4-91a3-2b5d09198608" containerName="container-00" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.148441 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.150586 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.150660 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.150731 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.174515 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-9cbbx"] Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.249631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mx9q\" (UniqueName: \"kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q\") pod \"auto-csr-approver-29565858-9cbbx\" (UID: \"b58931cc-765c-4d48-8a5f-239fee20e2c0\") " pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.356002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mx9q\" (UniqueName: \"kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q\") pod \"auto-csr-approver-29565858-9cbbx\" (UID: \"b58931cc-765c-4d48-8a5f-239fee20e2c0\") " pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.396328 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4mx9q\" (UniqueName: \"kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q\") pod \"auto-csr-approver-29565858-9cbbx\" (UID: \"b58931cc-765c-4d48-8a5f-239fee20e2c0\") " pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.464685 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.727489 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ac9db1d-1045-42f9-a7af-1c118226d1d2/mysql-bootstrap/0.log" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.806392 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85185bfa-1205-4129-8f90-55b580fd3939/mysql-bootstrap/0.log" Mar 19 20:18:00 crc kubenswrapper[5033]: I0319 20:18:00.908308 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85185bfa-1205-4129-8f90-55b580fd3939/galera/0.log" Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.229304 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ac9db1d-1045-42f9-a7af-1c118226d1d2/mysql-bootstrap/0.log" Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.270335 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-9cbbx"] Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.273345 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.314022 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ac9db1d-1045-42f9-a7af-1c118226d1d2/galera/0.log" Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.532735 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_d14b9715-0fd4-48c5-8531-42c4d60ec6e6/openstackclient/0.log" Mar 19 20:18:01 crc kubenswrapper[5033]: I0319 20:18:01.856787 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8z6ts_ebcc8953-fc35-48d7-a3fd-be1a2291c08c/ovn-controller/0.log" Mar 19 20:18:02 crc kubenswrapper[5033]: I0319 20:18:02.139171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" event={"ID":"b58931cc-765c-4d48-8a5f-239fee20e2c0","Type":"ContainerStarted","Data":"1a5038b880f7671d0ecbbba539362432447dfb0982705164d4ee90ebb0ca3c1b"} Mar 19 20:18:02 crc kubenswrapper[5033]: I0319 20:18:02.530484 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d4vfd_7b48e9b9-7f87-4d91-ad8c-eb50df3b6534/ovsdb-server-init/0.log" Mar 19 20:18:02 crc kubenswrapper[5033]: I0319 20:18:02.689650 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-f4hlq_013523ec-f077-4920-9b16-018f37cf5ef6/openstack-network-exporter/0.log" Mar 19 20:18:02 crc kubenswrapper[5033]: I0319 20:18:02.995065 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d4vfd_7b48e9b9-7f87-4d91-ad8c-eb50df3b6534/ovsdb-server-init/0.log" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.077840 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d4vfd_7b48e9b9-7f87-4d91-ad8c-eb50df3b6534/ovs-vswitchd/0.log" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.151899 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" event={"ID":"b58931cc-765c-4d48-8a5f-239fee20e2c0","Type":"ContainerStarted","Data":"9219efb981eab54a08733e694f05300ee547043fd48807932253175a8fe81cef"} Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.175805 5033 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" podStartSLOduration=2.204280861 podStartE2EDuration="3.175789147s" podCreationTimestamp="2026-03-19 20:18:00 +0000 UTC" firstStartedPulling="2026-03-19 20:18:01.273117163 +0000 UTC m=+4891.378147012" lastFinishedPulling="2026-03-19 20:18:02.244625449 +0000 UTC m=+4892.349655298" observedRunningTime="2026-03-19 20:18:03.171734422 +0000 UTC m=+4893.276764261" watchObservedRunningTime="2026-03-19 20:18:03.175789147 +0000 UTC m=+4893.280818996" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.346257 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d4vfd_7b48e9b9-7f87-4d91-ad8c-eb50df3b6534/ovsdb-server/0.log" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.688252 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-495rs_158eff83-6646-4490-8169-9d2c9c9cd06c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.936862 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_920698c6-4c9d-4e12-bab3-9d7091f02548/openstack-network-exporter/0.log" Mar 19 20:18:03 crc kubenswrapper[5033]: I0319 20:18:03.969638 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_920698c6-4c9d-4e12-bab3-9d7091f02548/ovn-northd/0.log" Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.138312 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_035d8393-f8cc-4c44-b116-245b5e93e70c/openstack-network-exporter/0.log" Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.169332 5033 generic.go:334] "Generic (PLEG): container finished" podID="b58931cc-765c-4d48-8a5f-239fee20e2c0" containerID="9219efb981eab54a08733e694f05300ee547043fd48807932253175a8fe81cef" exitCode=0 Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.169373 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" event={"ID":"b58931cc-765c-4d48-8a5f-239fee20e2c0","Type":"ContainerDied","Data":"9219efb981eab54a08733e694f05300ee547043fd48807932253175a8fe81cef"} Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.476690 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_035d8393-f8cc-4c44-b116-245b5e93e70c/ovsdbserver-nb/0.log" Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.684968 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_314e5a54-3e9c-42ce-807e-f798a2ab66f9/openstack-network-exporter/0.log" Mar 19 20:18:04 crc kubenswrapper[5033]: I0319 20:18:04.740832 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_314e5a54-3e9c-42ce-807e-f798a2ab66f9/ovsdbserver-sb/0.log" Mar 19 20:18:05 crc kubenswrapper[5033]: I0319 20:18:05.128235 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d0456b5-83d8-48b0-84a0-2d4d60604b9b/init-config-reloader/0.log" Mar 19 20:18:05 crc kubenswrapper[5033]: I0319 20:18:05.150067 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66c5bf8b4d-dnhrv_f91a4632-26f5-4e3e-82d7-53b28bd561f5/placement-api/0.log" Mar 19 20:18:05 crc kubenswrapper[5033]: I0319 20:18:05.229930 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-66c5bf8b4d-dnhrv_f91a4632-26f5-4e3e-82d7-53b28bd561f5/placement-log/0.log" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.202353 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d0456b5-83d8-48b0-84a0-2d4d60604b9b/thanos-sidecar/0.log" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.207961 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_1d0456b5-83d8-48b0-84a0-2d4d60604b9b/prometheus/0.log" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.278066 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d0456b5-83d8-48b0-84a0-2d4d60604b9b/config-reloader/0.log" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.395887 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1d0456b5-83d8-48b0-84a0-2d4d60604b9b/init-config-reloader/0.log" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.467973 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.515372 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mx9q\" (UniqueName: \"kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q\") pod \"b58931cc-765c-4d48-8a5f-239fee20e2c0\" (UID: \"b58931cc-765c-4d48-8a5f-239fee20e2c0\") " Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.528600 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q" (OuterVolumeSpecName: "kube-api-access-4mx9q") pod "b58931cc-765c-4d48-8a5f-239fee20e2c0" (UID: "b58931cc-765c-4d48-8a5f-239fee20e2c0"). InnerVolumeSpecName "kube-api-access-4mx9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.617837 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mx9q\" (UniqueName: \"kubernetes.io/projected/b58931cc-765c-4d48-8a5f-239fee20e2c0-kube-api-access-4mx9q\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.622030 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:18:06 crc kubenswrapper[5033]: E0319 20:18:06.622281 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:18:06 crc kubenswrapper[5033]: I0319 20:18:06.782018 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f8dad24e-7338-41b9-b008-f3dd1c68d3de/setup-container/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.162948 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f8dad24e-7338-41b9-b008-f3dd1c68d3de/setup-container/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.240682 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" event={"ID":"b58931cc-765c-4d48-8a5f-239fee20e2c0","Type":"ContainerDied","Data":"1a5038b880f7671d0ecbbba539362432447dfb0982705164d4ee90ebb0ca3c1b"} Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.240720 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5038b880f7671d0ecbbba539362432447dfb0982705164d4ee90ebb0ca3c1b" Mar 19 
20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.240794 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-9cbbx" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.300570 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f8dad24e-7338-41b9-b008-f3dd1c68d3de/rabbitmq/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.342794 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c343ed29-14b7-4363-a055-7b540ee2ea31/setup-container/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.555547 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-rbwmt"] Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.568513 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-rbwmt"] Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.769096 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ksrs7_5db4f1c0-87be-4f27-8a76-6f2a3deb4158/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.795746 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c343ed29-14b7-4363-a055-7b540ee2ea31/setup-container/0.log" Mar 19 20:18:07 crc kubenswrapper[5033]: I0319 20:18:07.853645 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c343ed29-14b7-4363-a055-7b540ee2ea31/rabbitmq/0.log" Mar 19 20:18:08 crc kubenswrapper[5033]: I0319 20:18:08.131960 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-x4k79_f166f58a-6a51-48ba-ae2e-aa90d8a656dc/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:08 crc kubenswrapper[5033]: I0319 
20:18:08.263428 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cnm5f_473215a7-171f-46db-ad01-632b59a1eb95/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:08 crc kubenswrapper[5033]: I0319 20:18:08.635694 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4606ed7c-b667-43a1-93e8-f145b487594a" path="/var/lib/kubelet/pods/4606ed7c-b667-43a1-93e8-f145b487594a/volumes" Mar 19 20:18:08 crc kubenswrapper[5033]: I0319 20:18:08.706903 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4jwkh_f24a4d98-fb0f-4bcd-905f-4bfce712a950/ssh-known-hosts-edpm-deployment/0.log" Mar 19 20:18:08 crc kubenswrapper[5033]: I0319 20:18:08.867774 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-f2rml_94576899-d90e-4b04-ae41-06c47f2a4383/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:09 crc kubenswrapper[5033]: I0319 20:18:09.023631 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_862eb5fe-aecf-465c-a30b-5f9c0477d625/memcached/0.log" Mar 19 20:18:09 crc kubenswrapper[5033]: I0319 20:18:09.906874 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d74595b67-9drnj_66364c56-ef15-437e-8508-2f7b2c4471f8/proxy-httpd/0.log" Mar 19 20:18:09 crc kubenswrapper[5033]: I0319 20:18:09.980914 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d74595b67-9drnj_66364c56-ef15-437e-8508-2f7b2c4471f8/proxy-server/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.026354 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bz7gc_66d432b8-f84d-4565-96a7-7232024ffe4b/swift-ring-rebalance/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.306160 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/account-auditor/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.460621 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/account-server/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.463361 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/account-reaper/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.464118 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/account-replicator/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.717761 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/container-auditor/0.log" Mar 19 20:18:10 crc kubenswrapper[5033]: I0319 20:18:10.860903 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/container-replicator/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.155275 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/object-auditor/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.201349 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/container-server/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.288365 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/object-expirer/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.353356 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/container-updater/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.469572 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/object-replicator/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.536014 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/object-server/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.550241 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/object-updater/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.724528 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/rsync/0.log" Mar 19 20:18:11 crc kubenswrapper[5033]: I0319 20:18:11.821840 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a91fda80-4324-4015-a32f-3396d6d2da1d/swift-recon-cron/0.log" Mar 19 20:18:12 crc kubenswrapper[5033]: I0319 20:18:12.085154 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-s582f_57028763-d499-4e66-aaa7-52bbf97174d3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:12 crc kubenswrapper[5033]: I0319 20:18:12.141654 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_190e5876-b872-4c45-a860-696c0e739f2b/tempest-tests-tempest-tests-runner/0.log" Mar 19 20:18:12 crc kubenswrapper[5033]: I0319 20:18:12.353845 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c00f2e97-0513-4dd0-ad6d-851e50ba920f/test-operator-logs-container/0.log" Mar 19 20:18:12 crc kubenswrapper[5033]: I0319 
20:18:12.502408 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-khxk7_2323e870-76a4-4404-aed7-0c40ee7dd4d2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:18:20 crc kubenswrapper[5033]: I0319 20:18:20.657878 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:18:20 crc kubenswrapper[5033]: E0319 20:18:20.658848 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:18:33 crc kubenswrapper[5033]: I0319 20:18:33.620922 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:18:33 crc kubenswrapper[5033]: E0319 20:18:33.621742 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:18:42 crc kubenswrapper[5033]: I0319 20:18:42.287627 5033 scope.go:117] "RemoveContainer" containerID="99f480714ebd2ae3e4b0405adf6771e07bcce962fd477a5c7c5b15c2c23869d4" Mar 19 20:18:48 crc kubenswrapper[5033]: I0319 20:18:48.620066 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:18:49 crc kubenswrapper[5033]: I0319 
20:18:49.634825 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20"} Mar 19 20:19:08 crc kubenswrapper[5033]: I0319 20:19:08.640039 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/util/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.061494 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/pull/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.104890 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/pull/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.172062 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/util/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.477335 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/util/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.520564 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/extract/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.649861 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908bgh7r6_78c4876a-af26-49e2-95cf-232f2673a934/pull/0.log" Mar 19 20:19:09 crc kubenswrapper[5033]: I0319 20:19:09.878465 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-zjvmm_959b54c4-e249-46da-a57f-e997e6944147/manager/0.log" Mar 19 20:19:10 crc kubenswrapper[5033]: I0319 20:19:10.310380 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-5rrzc_e7453f1e-da50-4386-ac0d-64d309e192b0/manager/0.log" Mar 19 20:19:10 crc kubenswrapper[5033]: I0319 20:19:10.607522 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-n8ssn_d565f597-d2c9-45cf-84fd-81986897c7ec/manager/0.log" Mar 19 20:19:10 crc kubenswrapper[5033]: I0319 20:19:10.739817 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-h4rn2_344bdd4e-7b37-4bd7-b403-58a5aa242946/manager/0.log" Mar 19 20:19:11 crc kubenswrapper[5033]: I0319 20:19:11.123542 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-xtxhs_58ad3c6c-eec2-41dd-98ac-0ff2454ba608/manager/0.log" Mar 19 20:19:11 crc kubenswrapper[5033]: I0319 20:19:11.575736 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-t7sd4_ab2a955c-f6ac-4ecc-92ad-ecf8bcb1ed74/manager/0.log" Mar 19 20:19:11 crc kubenswrapper[5033]: I0319 20:19:11.812853 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-vfg9k_b5102fa8-cc34-4f68-a4af-f26a243c3238/manager/0.log" Mar 19 20:19:11 crc kubenswrapper[5033]: I0319 20:19:11.882730 5033 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-lf4wr_80d312b1-fa3e-4746-baaf-1aa74b1a6a46/manager/0.log" Mar 19 20:19:12 crc kubenswrapper[5033]: I0319 20:19:12.262247 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dfthw_b34dd812-d36d-4bfb-93df-03caacef3d64/manager/0.log" Mar 19 20:19:12 crc kubenswrapper[5033]: I0319 20:19:12.267770 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-j8lbc_22389b01-1060-4ddb-8fc3-3d3faeafefd2/manager/0.log" Mar 19 20:19:12 crc kubenswrapper[5033]: I0319 20:19:12.825128 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-4nfws_4e5bdc34-6776-4463-badb-8666194c5f89/manager/0.log" Mar 19 20:19:12 crc kubenswrapper[5033]: I0319 20:19:12.854425 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-r8r84_98807d9f-1747-4534-9727-57a6d81775b6/manager/0.log" Mar 19 20:19:13 crc kubenswrapper[5033]: I0319 20:19:13.136872 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-n9lpt_c8a93f31-4443-4247-b4de-d0bc1e26c6f8/manager/0.log" Mar 19 20:19:13 crc kubenswrapper[5033]: I0319 20:19:13.297730 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-hn4dm_336483e6-616b-4489-bd29-5f7b62ef0d45/manager/0.log" Mar 19 20:19:13 crc kubenswrapper[5033]: I0319 20:19:13.651653 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-58kzh_716e7850-0e5e-4cd7-8de4-1b3b6bd51a16/manager/0.log" Mar 19 20:19:13 crc kubenswrapper[5033]: I0319 
20:19:13.903767 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6c6f68556d-szp9l_ea548f00-7f7c-4716-acb1-8bf41a3a9e9e/operator/0.log" Mar 19 20:19:14 crc kubenswrapper[5033]: I0319 20:19:14.090738 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nhm8m_9b05fede-26cf-4467-9222-df3f6af81c0a/registry-server/0.log" Mar 19 20:19:14 crc kubenswrapper[5033]: I0319 20:19:14.596783 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-dgp4k_1d6c9166-95cd-42d6-9782-56e8879c0412/manager/0.log" Mar 19 20:19:14 crc kubenswrapper[5033]: I0319 20:19:14.695514 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-w2pfb_f160f298-8509-46b1-865f-7241c0dd299f/manager/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 20:19:15.030762 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-zczlg_ae6d87d9-9ab4-4a9c-84f4-e5913246875e/manager/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 20:19:15.083954 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-646cd56bc9-j4xnq_6a3c6b85-334f-431c-8840-ca3cf37451a9/manager/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 20:19:15.104486 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4cn7g_668a0189-af32-4674-8c48-101dac5c1e55/operator/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 20:19:15.593357 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8ggqw_fe433b1b-c379-493b-8e5a-74dff21a208d/manager/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 
20:19:15.701603 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c5c766d94-gvt8l_2f3d3208-9746-409f-95e9-7ada3c61671d/manager/0.log" Mar 19 20:19:15 crc kubenswrapper[5033]: I0319 20:19:15.886535 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-87zz5_16cc55d4-44b6-4e27-9afd-484d4db42d1a/manager/0.log" Mar 19 20:19:53 crc kubenswrapper[5033]: I0319 20:19:53.237121 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-75f9n_b71787f5-93bf-434a-b67d-63be189d843e/control-plane-machine-set-operator/0.log" Mar 19 20:19:53 crc kubenswrapper[5033]: I0319 20:19:53.642120 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jwxnr_ca47e31e-6c9f-471d-86ca-58ea515fc112/machine-api-operator/0.log" Mar 19 20:19:53 crc kubenswrapper[5033]: I0319 20:19:53.807006 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jwxnr_ca47e31e-6c9f-471d-86ca-58ea515fc112/kube-rbac-proxy/0.log" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.211519 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565860-tz5nj"] Mar 19 20:20:00 crc kubenswrapper[5033]: E0319 20:20:00.212529 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58931cc-765c-4d48-8a5f-239fee20e2c0" containerName="oc" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.212542 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58931cc-765c-4d48-8a5f-239fee20e2c0" containerName="oc" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.212775 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58931cc-765c-4d48-8a5f-239fee20e2c0" containerName="oc" Mar 19 20:20:00 crc 
kubenswrapper[5033]: I0319 20:20:00.213559 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.222513 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.222826 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.223198 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.240487 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-tz5nj"] Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.285927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mtt\" (UniqueName: \"kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt\") pod \"auto-csr-approver-29565860-tz5nj\" (UID: \"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a\") " pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.387535 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mtt\" (UniqueName: \"kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt\") pod \"auto-csr-approver-29565860-tz5nj\" (UID: \"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a\") " pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.405903 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mtt\" (UniqueName: \"kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt\") pod 
\"auto-csr-approver-29565860-tz5nj\" (UID: \"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a\") " pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:00 crc kubenswrapper[5033]: I0319 20:20:00.548916 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:01 crc kubenswrapper[5033]: I0319 20:20:01.370163 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-tz5nj"] Mar 19 20:20:02 crc kubenswrapper[5033]: I0319 20:20:02.369710 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" event={"ID":"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a","Type":"ContainerStarted","Data":"64785f9719a4b27fc9ef44d140a54dd18b5e72995dd18e6bf0e499d295c83bdd"} Mar 19 20:20:04 crc kubenswrapper[5033]: I0319 20:20:04.387560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" event={"ID":"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a","Type":"ContainerStarted","Data":"d8e0976a0bd2b50e40009777579e40b5c01e00acf3e6ae39fb752835c5aedb46"} Mar 19 20:20:04 crc kubenswrapper[5033]: I0319 20:20:04.406870 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" podStartSLOduration=2.660122838 podStartE2EDuration="4.40685254s" podCreationTimestamp="2026-03-19 20:20:00 +0000 UTC" firstStartedPulling="2026-03-19 20:20:01.376835104 +0000 UTC m=+5011.481864953" lastFinishedPulling="2026-03-19 20:20:03.123564806 +0000 UTC m=+5013.228594655" observedRunningTime="2026-03-19 20:20:04.398383811 +0000 UTC m=+5014.503413660" watchObservedRunningTime="2026-03-19 20:20:04.40685254 +0000 UTC m=+5014.511882389" Mar 19 20:20:05 crc kubenswrapper[5033]: I0319 20:20:05.396738 5033 generic.go:334] "Generic (PLEG): container finished" podID="d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" 
containerID="d8e0976a0bd2b50e40009777579e40b5c01e00acf3e6ae39fb752835c5aedb46" exitCode=0 Mar 19 20:20:05 crc kubenswrapper[5033]: I0319 20:20:05.396900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" event={"ID":"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a","Type":"ContainerDied","Data":"d8e0976a0bd2b50e40009777579e40b5c01e00acf3e6ae39fb752835c5aedb46"} Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.422030 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" event={"ID":"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a","Type":"ContainerDied","Data":"64785f9719a4b27fc9ef44d140a54dd18b5e72995dd18e6bf0e499d295c83bdd"} Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.422555 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64785f9719a4b27fc9ef44d140a54dd18b5e72995dd18e6bf0e499d295c83bdd" Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.490971 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.639153 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8mtt\" (UniqueName: \"kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt\") pod \"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a\" (UID: \"d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a\") " Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.645914 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt" (OuterVolumeSpecName: "kube-api-access-s8mtt") pod "d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" (UID: "d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a"). InnerVolumeSpecName "kube-api-access-s8mtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:20:07 crc kubenswrapper[5033]: I0319 20:20:07.741576 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8mtt\" (UniqueName: \"kubernetes.io/projected/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a-kube-api-access-s8mtt\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:08 crc kubenswrapper[5033]: I0319 20:20:08.429124 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-tz5nj" Mar 19 20:20:08 crc kubenswrapper[5033]: I0319 20:20:08.554982 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-xlsjz"] Mar 19 20:20:08 crc kubenswrapper[5033]: I0319 20:20:08.569402 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-xlsjz"] Mar 19 20:20:08 crc kubenswrapper[5033]: I0319 20:20:08.631573 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31ccec7-6d67-4152-8128-afca55385d33" path="/var/lib/kubelet/pods/e31ccec7-6d67-4152-8128-afca55385d33/volumes" Mar 19 20:20:20 crc kubenswrapper[5033]: I0319 20:20:20.079122 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tqbxh_4dd5d60d-d4a3-4c4a-a194-b5c39e7bc4fd/cert-manager-controller/0.log" Mar 19 20:20:20 crc kubenswrapper[5033]: I0319 20:20:20.459679 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-hggdg_9a693b77-c591-4899-9aba-6f674eac5601/cert-manager-cainjector/0.log" Mar 19 20:20:20 crc kubenswrapper[5033]: I0319 20:20:20.572497 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-9tjpj_13744f00-1ee4-48fb-a839-d8bb7e4d7a7b/cert-manager-webhook/0.log" Mar 19 20:20:42 crc kubenswrapper[5033]: I0319 20:20:42.405301 5033 scope.go:117] "RemoveContainer" 
containerID="41c8d2dd39d728831efe5582486639e477e74bd5d1c414a0cba1fb6f3be69a74" Mar 19 20:20:45 crc kubenswrapper[5033]: I0319 20:20:45.487005 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-vvqkr_b45e3fdc-a192-4b4c-830c-bfb94179eed7/nmstate-console-plugin/0.log" Mar 19 20:20:45 crc kubenswrapper[5033]: I0319 20:20:45.850376 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2z9z2_6687aa22-fc81-4b3b-a810-29da61fff408/nmstate-handler/0.log" Mar 19 20:20:46 crc kubenswrapper[5033]: I0319 20:20:46.040810 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-vpctf_06756e34-c004-4dc7-85ca-75b9c0b8ea15/kube-rbac-proxy/0.log" Mar 19 20:20:46 crc kubenswrapper[5033]: I0319 20:20:46.183171 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-vpctf_06756e34-c004-4dc7-85ca-75b9c0b8ea15/nmstate-metrics/0.log" Mar 19 20:20:46 crc kubenswrapper[5033]: I0319 20:20:46.363550 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-d97f6_51881d08-f38e-4817-b7d6-942913bff182/nmstate-operator/0.log" Mar 19 20:20:46 crc kubenswrapper[5033]: I0319 20:20:46.551151 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7fsr6_cacfaf75-f7b6-4ca4-ad40-661224a27fad/nmstate-webhook/0.log" Mar 19 20:21:10 crc kubenswrapper[5033]: I0319 20:21:10.758268 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:21:10 crc kubenswrapper[5033]: I0319 20:21:10.758857 5033 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:21:11 crc kubenswrapper[5033]: I0319 20:21:11.482051 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-767b88fbc9-6bv6n_25d27288-ba82-4c74-a864-b5e54e4be246/kube-rbac-proxy/0.log" Mar 19 20:21:11 crc kubenswrapper[5033]: I0319 20:21:11.513023 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-767b88fbc9-6bv6n_25d27288-ba82-4c74-a864-b5e54e4be246/manager/0.log" Mar 19 20:21:37 crc kubenswrapper[5033]: I0319 20:21:37.164033 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-hr4xj_2f90a5c2-7618-4090-b63c-6d40664ab26e/prometheus-operator/0.log" Mar 19 20:21:37 crc kubenswrapper[5033]: I0319 20:21:37.666172 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj_88b04cc7-0103-4c0a-bd35-421e81888064/prometheus-operator-admission-webhook/0.log" Mar 19 20:21:37 crc kubenswrapper[5033]: I0319 20:21:37.761537 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5_0bff649f-6d36-42eb-8419-aebbe076b40c/prometheus-operator-admission-webhook/0.log" Mar 19 20:21:37 crc kubenswrapper[5033]: I0319 20:21:37.886407 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-v94xx_92efd1e2-4b4b-48b0-a991-1c3cfe62eef3/operator/0.log" Mar 19 20:21:38 crc kubenswrapper[5033]: I0319 20:21:38.183278 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-558f99686-58c2z_5f1646e5-4c53-4c2f-93f0-b32741aa44ae/perses-operator/0.log" Mar 19 20:21:40 crc kubenswrapper[5033]: I0319 20:21:40.759172 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:21:40 crc kubenswrapper[5033]: I0319 20:21:40.759704 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.139628 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565862-jd6m9"] Mar 19 20:22:00 crc kubenswrapper[5033]: E0319 20:22:00.140606 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.140619 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.140815 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.141503 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.144975 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.145065 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.148122 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.155128 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-jd6m9"] Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.246038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm65k\" (UniqueName: \"kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k\") pod \"auto-csr-approver-29565862-jd6m9\" (UID: \"5781f381-0321-47f5-a74c-bd0331daffd9\") " pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.355330 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm65k\" (UniqueName: \"kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k\") pod \"auto-csr-approver-29565862-jd6m9\" (UID: \"5781f381-0321-47f5-a74c-bd0331daffd9\") " pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.377235 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm65k\" (UniqueName: \"kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k\") pod \"auto-csr-approver-29565862-jd6m9\" (UID: \"5781f381-0321-47f5-a74c-bd0331daffd9\") " 
pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:00 crc kubenswrapper[5033]: I0319 20:22:00.475629 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:01 crc kubenswrapper[5033]: I0319 20:22:01.835836 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-jd6m9"] Mar 19 20:22:02 crc kubenswrapper[5033]: I0319 20:22:02.443958 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" event={"ID":"5781f381-0321-47f5-a74c-bd0331daffd9","Type":"ContainerStarted","Data":"2a22f8830ec1f41815496dff7c7271b866de03dbbc0b1b7c920623ca8c5613e4"} Mar 19 20:22:03 crc kubenswrapper[5033]: I0319 20:22:03.453118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" event={"ID":"5781f381-0321-47f5-a74c-bd0331daffd9","Type":"ContainerStarted","Data":"7e61c18abce6cc0dac1834bd702fda35fa76b6e2a06f3515d98c4bffb602e93f"} Mar 19 20:22:03 crc kubenswrapper[5033]: I0319 20:22:03.467079 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" podStartSLOduration=2.659264359 podStartE2EDuration="3.467059875s" podCreationTimestamp="2026-03-19 20:22:00 +0000 UTC" firstStartedPulling="2026-03-19 20:22:01.837810928 +0000 UTC m=+5131.942840777" lastFinishedPulling="2026-03-19 20:22:02.645606444 +0000 UTC m=+5132.750636293" observedRunningTime="2026-03-19 20:22:03.464313808 +0000 UTC m=+5133.569343657" watchObservedRunningTime="2026-03-19 20:22:03.467059875 +0000 UTC m=+5133.572089724" Mar 19 20:22:04 crc kubenswrapper[5033]: I0319 20:22:04.464868 5033 generic.go:334] "Generic (PLEG): container finished" podID="5781f381-0321-47f5-a74c-bd0331daffd9" containerID="7e61c18abce6cc0dac1834bd702fda35fa76b6e2a06f3515d98c4bffb602e93f" exitCode=0 Mar 19 20:22:04 crc 
kubenswrapper[5033]: I0319 20:22:04.465083 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" event={"ID":"5781f381-0321-47f5-a74c-bd0331daffd9","Type":"ContainerDied","Data":"7e61c18abce6cc0dac1834bd702fda35fa76b6e2a06f3515d98c4bffb602e93f"} Mar 19 20:22:06 crc kubenswrapper[5033]: I0319 20:22:06.843645 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.043108 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm65k\" (UniqueName: \"kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k\") pod \"5781f381-0321-47f5-a74c-bd0331daffd9\" (UID: \"5781f381-0321-47f5-a74c-bd0331daffd9\") " Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.062323 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k" (OuterVolumeSpecName: "kube-api-access-fm65k") pod "5781f381-0321-47f5-a74c-bd0331daffd9" (UID: "5781f381-0321-47f5-a74c-bd0331daffd9"). InnerVolumeSpecName "kube-api-access-fm65k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.132965 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-dwtb7_cb4013d3-e95f-4c07-803c-fef1db499d5f/controller/0.log" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.148835 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm65k\" (UniqueName: \"kubernetes.io/projected/5781f381-0321-47f5-a74c-bd0331daffd9-kube-api-access-fm65k\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.159660 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-dwtb7_cb4013d3-e95f-4c07-803c-fef1db499d5f/kube-rbac-proxy/0.log" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.490832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" event={"ID":"5781f381-0321-47f5-a74c-bd0331daffd9","Type":"ContainerDied","Data":"2a22f8830ec1f41815496dff7c7271b866de03dbbc0b1b7c920623ca8c5613e4"} Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.491117 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a22f8830ec1f41815496dff7c7271b866de03dbbc0b1b7c920623ca8c5613e4" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.491169 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-jd6m9" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.579403 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-frr-files/0.log" Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.911535 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-xxmqn"] Mar 19 20:22:07 crc kubenswrapper[5033]: I0319 20:22:07.921070 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-xxmqn"] Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.033157 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-reloader/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.112659 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-frr-files/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.211776 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-reloader/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.215535 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-metrics/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.440776 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-reloader/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.546224 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-frr-files/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.631115 5033 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c9821a-4188-4e83-87f2-3ce4c0158d02" path="/var/lib/kubelet/pods/32c9821a-4188-4e83-87f2-3ce4c0158d02/volumes" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.637484 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-metrics/0.log" Mar 19 20:22:08 crc kubenswrapper[5033]: I0319 20:22:08.685927 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-metrics/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.126677 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-reloader/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.161815 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/controller/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.184517 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-metrics/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.427331 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/cp-frr-files/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.738210 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/frr-metrics/0.log" Mar 19 20:22:09 crc kubenswrapper[5033]: I0319 20:22:09.754964 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/kube-rbac-proxy/0.log" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.098528 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/reloader/0.log" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.168888 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/kube-rbac-proxy-frr/0.log" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.405600 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-8tmvb_a8f17329-5c04-45bf-9246-fa2aec1e793f/frr-k8s-webhook-server/0.log" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.693393 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5db5b6db5f-r8dbz_7c3f2160-7476-4b83-aa92-cb6064f90495/manager/0.log" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.758215 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.758278 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.758326 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.759160 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:22:10 crc kubenswrapper[5033]: I0319 20:22:10.759221 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" containerID="cri-o://f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20" gracePeriod=600 Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.004656 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fdfd57896-smckp_edc8556f-ba2e-4280-bab3-5778367b0c75/webhook-server/0.log" Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.198243 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bn5m8_f1995507-c619-4057-921c-8ca218c4f82e/kube-rbac-proxy/0.log" Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.425078 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pdvpg_e02d5e27-7333-463b-9144-9fa72da9592c/frr/0.log" Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.552053 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20" exitCode=0 Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.552094 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20"} Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.552118 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646"} Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.552134 5033 scope.go:117] "RemoveContainer" containerID="0d1af855cbe225433871791d27aa9411a4a8ee9fb0cf916671c0509f99be4bbe" Mar 19 20:22:11 crc kubenswrapper[5033]: I0319 20:22:11.707255 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bn5m8_f1995507-c619-4057-921c-8ca218c4f82e/speaker/0.log" Mar 19 20:22:38 crc kubenswrapper[5033]: I0319 20:22:38.963145 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/util/0.log" Mar 19 20:22:39 crc kubenswrapper[5033]: I0319 20:22:39.911471 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/util/0.log" Mar 19 20:22:39 crc kubenswrapper[5033]: I0319 20:22:39.926217 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/pull/0.log" Mar 19 20:22:39 crc kubenswrapper[5033]: I0319 20:22:39.958154 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/pull/0.log" Mar 19 20:22:40 crc kubenswrapper[5033]: I0319 20:22:40.323209 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/extract/0.log" Mar 19 20:22:40 crc 
kubenswrapper[5033]: I0319 20:22:40.426594 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/util/0.log" Mar 19 20:22:40 crc kubenswrapper[5033]: I0319 20:22:40.459025 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748gmsn_8e1f452b-54aa-4321-99d5-ebdf6991379a/pull/0.log" Mar 19 20:22:40 crc kubenswrapper[5033]: I0319 20:22:40.772733 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/util/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.189620 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/util/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.192413 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/pull/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.237901 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/pull/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.495803 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/pull/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.560921 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/util/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.585663 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xxtkf_d25c425c-248a-4677-be0a-91a0cd1ccf41/extract/0.log" Mar 19 20:22:41 crc kubenswrapper[5033]: I0319 20:22:41.849570 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/util/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.343262 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/util/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.366894 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/pull/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.449442 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/pull/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.504541 5033 scope.go:117] "RemoveContainer" containerID="e480109e86113cf34e7797564dd40d30e79d579cf38467bc53ec2c3879f6daa3" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.815003 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/extract/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.831170 5033 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/util/0.log" Mar 19 20:22:42 crc kubenswrapper[5033]: I0319 20:22:42.887357 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726kc2m8_f7526813-a7dc-4074-a7ea-f791760e3cb0/pull/0.log" Mar 19 20:22:43 crc kubenswrapper[5033]: I0319 20:22:43.313109 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/util/0.log" Mar 19 20:22:43 crc kubenswrapper[5033]: I0319 20:22:43.778773 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/pull/0.log" Mar 19 20:22:43 crc kubenswrapper[5033]: I0319 20:22:43.792840 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/util/0.log" Mar 19 20:22:43 crc kubenswrapper[5033]: I0319 20:22:43.840064 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/pull/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.137652 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/util/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.179726 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/extract/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.221269 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcst6rb_3470cf1f-22c6-4e0b-b298-7500f269fba3/pull/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.358991 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-utilities/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.540140 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-content/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.602688 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-content/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.646094 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-utilities/0.log" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.941373 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:22:44 crc kubenswrapper[5033]: E0319 20:22:44.942338 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5781f381-0321-47f5-a74c-bd0331daffd9" containerName="oc" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.942362 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5781f381-0321-47f5-a74c-bd0331daffd9" containerName="oc" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.942571 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5781f381-0321-47f5-a74c-bd0331daffd9" containerName="oc" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.944101 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:44 crc kubenswrapper[5033]: I0319 20:22:44.963744 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.003736 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-utilities/0.log" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.027027 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/extract-content/0.log" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.052644 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-utilities/0.log" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.103321 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.103366 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqc7\" (UniqueName: \"kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " 
pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.103659 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.205102 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.205245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.205286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rqc7\" (UniqueName: \"kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.206021 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " 
pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.206015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.228365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqc7\" (UniqueName: \"kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7\") pod \"community-operators-rgqzm\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.261425 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.401222 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4bw7v_76b35dca-6939-425e-80ce-4f8801214a28/registry-server/0.log" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.829800 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-content/0.log" Mar 19 20:22:45 crc kubenswrapper[5033]: I0319 20:22:45.945489 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-content/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.013360 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-utilities/0.log" Mar 19 20:22:46 crc 
kubenswrapper[5033]: I0319 20:22:46.124596 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.288417 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-utilities/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.321702 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/extract-content/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.567975 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fkdb8_fabc7b19-9ff5-4323-a2b5-9b8b4ecd7152/registry-server/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.594171 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xpkw6_3a73e1d1-a896-4aa7-bba9-c372ab716534/marketplace-operator/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.785629 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-utilities/0.log" Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.926912 5033 generic.go:334] "Generic (PLEG): container finished" podID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerID="c10cf197b625dad141171c12b57401e6aa5ec1dace95c2874e1d04067d01cdd6" exitCode=0 Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.926960 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerDied","Data":"c10cf197b625dad141171c12b57401e6aa5ec1dace95c2874e1d04067d01cdd6"} Mar 19 20:22:46 crc kubenswrapper[5033]: I0319 20:22:46.926987 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerStarted","Data":"a0f09c907ae7c93bb858223e6bdb3a46db782be23a47d24a5fa45a7696056228"} Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.544149 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.546773 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.580760 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.597633 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-content/0.log" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.644697 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-utilities/0.log" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.661838 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.662041 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878wj\" (UniqueName: \"kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj\") pod \"redhat-operators-zn2mr\" (UID: 
\"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.662165 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.681788 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-content/0.log" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.764846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.765365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.765431 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878wj\" (UniqueName: \"kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.772926 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.773391 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.794675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878wj\" (UniqueName: \"kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj\") pod \"redhat-operators-zn2mr\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:47 crc kubenswrapper[5033]: I0319 20:22:47.863194 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.068116 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-content/0.log" Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.285635 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/extract-utilities/0.log" Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.311767 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-utilities/0.log" Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.353246 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mdzw6_64d10e25-ab07-42ba-90c0-4b57737633f7/registry-server/0.log" Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.726928 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.948552 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerStarted","Data":"32f63e2d631e22a29cb5e9a0f805b4f1768663cf02dc917a320563d97dd96a44"} Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.950351 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerStarted","Data":"31915f57bb0cab8381780b84c4aceb3c28a3c8c1b03b5546d12888d1dcd0d4a6"} Mar 19 20:22:48 crc kubenswrapper[5033]: I0319 20:22:48.950385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" 
event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerStarted","Data":"f961037011438b3967c4c5e18709c062f64c1a982438b41ddf3a5f1d641e04f2"} Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.045982 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-content/0.log" Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.182971 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-content/0.log" Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.239702 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-utilities/0.log" Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.664926 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-utilities/0.log" Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.967303 5033 generic.go:334] "Generic (PLEG): container finished" podID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerID="32f63e2d631e22a29cb5e9a0f805b4f1768663cf02dc917a320563d97dd96a44" exitCode=0 Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.967643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerDied","Data":"32f63e2d631e22a29cb5e9a0f805b4f1768663cf02dc917a320563d97dd96a44"} Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.970722 5033 generic.go:334] "Generic (PLEG): container finished" podID="452a9295-a6f3-4d20-8315-882690326d72" containerID="31915f57bb0cab8381780b84c4aceb3c28a3c8c1b03b5546d12888d1dcd0d4a6" exitCode=0 Mar 19 20:22:49 crc kubenswrapper[5033]: I0319 20:22:49.970748 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerDied","Data":"31915f57bb0cab8381780b84c4aceb3c28a3c8c1b03b5546d12888d1dcd0d4a6"} Mar 19 20:22:50 crc kubenswrapper[5033]: I0319 20:22:50.161783 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/extract-content/0.log" Mar 19 20:22:50 crc kubenswrapper[5033]: I0319 20:22:50.456531 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b57dx_1686c372-2322-47b2-a8a9-ad674bd5bf0b/registry-server/0.log" Mar 19 20:22:50 crc kubenswrapper[5033]: I0319 20:22:50.985213 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerStarted","Data":"775ddbed552a5d242f7cd2ea2044d673660edbbd399819b7d6eee74500d18745"} Mar 19 20:22:51 crc kubenswrapper[5033]: I0319 20:22:51.009878 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rgqzm" podStartSLOduration=3.466282716 podStartE2EDuration="7.009863806s" podCreationTimestamp="2026-03-19 20:22:44 +0000 UTC" firstStartedPulling="2026-03-19 20:22:46.930750735 +0000 UTC m=+5177.035780584" lastFinishedPulling="2026-03-19 20:22:50.474331825 +0000 UTC m=+5180.579361674" observedRunningTime="2026-03-19 20:22:51.009686531 +0000 UTC m=+5181.114716380" watchObservedRunningTime="2026-03-19 20:22:51.009863806 +0000 UTC m=+5181.114893655" Mar 19 20:22:52 crc kubenswrapper[5033]: I0319 20:22:52.000917 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerStarted","Data":"edf7d5682adead4c9df8d4062d19d04b6e78f176ea013e3be7fb2c718cdcc589"} Mar 
19 20:22:55 crc kubenswrapper[5033]: I0319 20:22:55.261755 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:55 crc kubenswrapper[5033]: I0319 20:22:55.262302 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:55 crc kubenswrapper[5033]: I0319 20:22:55.308007 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:56 crc kubenswrapper[5033]: I0319 20:22:56.319192 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:22:57 crc kubenswrapper[5033]: I0319 20:22:57.528994 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:22:58 crc kubenswrapper[5033]: I0319 20:22:58.057395 5033 generic.go:334] "Generic (PLEG): container finished" podID="452a9295-a6f3-4d20-8315-882690326d72" containerID="edf7d5682adead4c9df8d4062d19d04b6e78f176ea013e3be7fb2c718cdcc589" exitCode=0 Mar 19 20:22:58 crc kubenswrapper[5033]: I0319 20:22:58.057498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerDied","Data":"edf7d5682adead4c9df8d4062d19d04b6e78f176ea013e3be7fb2c718cdcc589"} Mar 19 20:22:58 crc kubenswrapper[5033]: I0319 20:22:58.057826 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rgqzm" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="registry-server" containerID="cri-o://775ddbed552a5d242f7cd2ea2044d673660edbbd399819b7d6eee74500d18745" gracePeriod=2 Mar 19 20:22:59 crc kubenswrapper[5033]: I0319 20:22:59.076126 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerID="775ddbed552a5d242f7cd2ea2044d673660edbbd399819b7d6eee74500d18745" exitCode=0 Mar 19 20:22:59 crc kubenswrapper[5033]: I0319 20:22:59.076518 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerDied","Data":"775ddbed552a5d242f7cd2ea2044d673660edbbd399819b7d6eee74500d18745"} Mar 19 20:22:59 crc kubenswrapper[5033]: I0319 20:22:59.096275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerStarted","Data":"1a20f8cefac21dd9757779ccefb04b2424644c1cd05f2d67fd9a93b348599f44"} Mar 19 20:22:59 crc kubenswrapper[5033]: I0319 20:22:59.123087 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn2mr" podStartSLOduration=3.309425764 podStartE2EDuration="12.123053s" podCreationTimestamp="2026-03-19 20:22:47 +0000 UTC" firstStartedPulling="2026-03-19 20:22:49.972017948 +0000 UTC m=+5180.077047797" lastFinishedPulling="2026-03-19 20:22:58.785645184 +0000 UTC m=+5188.890675033" observedRunningTime="2026-03-19 20:22:59.122397722 +0000 UTC m=+5189.227427571" watchObservedRunningTime="2026-03-19 20:22:59.123053 +0000 UTC m=+5189.228082849" Mar 19 20:22:59 crc kubenswrapper[5033]: I0319 20:22:59.959713 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.044288 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rqc7\" (UniqueName: \"kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7\") pod \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.044763 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content\") pod \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.045188 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities\") pod \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\" (UID: \"a0e6863d-c4df-42a0-8c04-6912e8583e5b\") " Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.046330 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities" (OuterVolumeSpecName: "utilities") pod "a0e6863d-c4df-42a0-8c04-6912e8583e5b" (UID: "a0e6863d-c4df-42a0-8c04-6912e8583e5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.070669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7" (OuterVolumeSpecName: "kube-api-access-8rqc7") pod "a0e6863d-c4df-42a0-8c04-6912e8583e5b" (UID: "a0e6863d-c4df-42a0-8c04-6912e8583e5b"). InnerVolumeSpecName "kube-api-access-8rqc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.133665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgqzm" event={"ID":"a0e6863d-c4df-42a0-8c04-6912e8583e5b","Type":"ContainerDied","Data":"a0f09c907ae7c93bb858223e6bdb3a46db782be23a47d24a5fa45a7696056228"} Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.133732 5033 scope.go:117] "RemoveContainer" containerID="775ddbed552a5d242f7cd2ea2044d673660edbbd399819b7d6eee74500d18745" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.133897 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgqzm" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.148321 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0e6863d-c4df-42a0-8c04-6912e8583e5b" (UID: "a0e6863d-c4df-42a0-8c04-6912e8583e5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.149986 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.150007 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e6863d-c4df-42a0-8c04-6912e8583e5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.150018 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rqc7\" (UniqueName: \"kubernetes.io/projected/a0e6863d-c4df-42a0-8c04-6912e8583e5b-kube-api-access-8rqc7\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.172623 5033 scope.go:117] "RemoveContainer" containerID="32f63e2d631e22a29cb5e9a0f805b4f1768663cf02dc917a320563d97dd96a44" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.223898 5033 scope.go:117] "RemoveContainer" containerID="c10cf197b625dad141171c12b57401e6aa5ec1dace95c2874e1d04067d01cdd6" Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.468336 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.481097 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rgqzm"] Mar 19 20:23:00 crc kubenswrapper[5033]: I0319 20:23:00.640725 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" path="/var/lib/kubelet/pods/a0e6863d-c4df-42a0-8c04-6912e8583e5b/volumes" Mar 19 20:23:07 crc kubenswrapper[5033]: I0319 20:23:07.863478 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:07 crc kubenswrapper[5033]: I0319 20:23:07.864154 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:07 crc kubenswrapper[5033]: I0319 20:23:07.930518 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:08 crc kubenswrapper[5033]: I0319 20:23:08.274505 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:08 crc kubenswrapper[5033]: I0319 20:23:08.336587 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:23:10 crc kubenswrapper[5033]: I0319 20:23:10.230605 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zn2mr" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="registry-server" containerID="cri-o://1a20f8cefac21dd9757779ccefb04b2424644c1cd05f2d67fd9a93b348599f44" gracePeriod=2 Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.241433 5033 generic.go:334] "Generic (PLEG): container finished" podID="452a9295-a6f3-4d20-8315-882690326d72" containerID="1a20f8cefac21dd9757779ccefb04b2424644c1cd05f2d67fd9a93b348599f44" exitCode=0 Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.241488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerDied","Data":"1a20f8cefac21dd9757779ccefb04b2424644c1cd05f2d67fd9a93b348599f44"} Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.484562 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.576400 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content\") pod \"452a9295-a6f3-4d20-8315-882690326d72\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.576469 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities\") pod \"452a9295-a6f3-4d20-8315-882690326d72\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.576492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-878wj\" (UniqueName: \"kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj\") pod \"452a9295-a6f3-4d20-8315-882690326d72\" (UID: \"452a9295-a6f3-4d20-8315-882690326d72\") " Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.577516 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities" (OuterVolumeSpecName: "utilities") pod "452a9295-a6f3-4d20-8315-882690326d72" (UID: "452a9295-a6f3-4d20-8315-882690326d72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.592658 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj" (OuterVolumeSpecName: "kube-api-access-878wj") pod "452a9295-a6f3-4d20-8315-882690326d72" (UID: "452a9295-a6f3-4d20-8315-882690326d72"). InnerVolumeSpecName "kube-api-access-878wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.678738 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.678990 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-878wj\" (UniqueName: \"kubernetes.io/projected/452a9295-a6f3-4d20-8315-882690326d72-kube-api-access-878wj\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.712492 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "452a9295-a6f3-4d20-8315-882690326d72" (UID: "452a9295-a6f3-4d20-8315-882690326d72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:11 crc kubenswrapper[5033]: I0319 20:23:11.780600 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452a9295-a6f3-4d20-8315-882690326d72-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.254502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn2mr" event={"ID":"452a9295-a6f3-4d20-8315-882690326d72","Type":"ContainerDied","Data":"f961037011438b3967c4c5e18709c062f64c1a982438b41ddf3a5f1d641e04f2"} Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.254864 5033 scope.go:117] "RemoveContainer" containerID="1a20f8cefac21dd9757779ccefb04b2424644c1cd05f2d67fd9a93b348599f44" Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.254568 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn2mr" Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.280754 5033 scope.go:117] "RemoveContainer" containerID="edf7d5682adead4c9df8d4062d19d04b6e78f176ea013e3be7fb2c718cdcc589" Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.311410 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.330155 5033 scope.go:117] "RemoveContainer" containerID="31915f57bb0cab8381780b84c4aceb3c28a3c8c1b03b5546d12888d1dcd0d4a6" Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.333038 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zn2mr"] Mar 19 20:23:12 crc kubenswrapper[5033]: I0319 20:23:12.630849 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452a9295-a6f3-4d20-8315-882690326d72" path="/var/lib/kubelet/pods/452a9295-a6f3-4d20-8315-882690326d72/volumes" Mar 19 20:23:18 crc kubenswrapper[5033]: I0319 20:23:18.924432 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-577fb74b9f-b68t5_0bff649f-6d36-42eb-8419-aebbe076b40c/prometheus-operator-admission-webhook/0.log" Mar 19 20:23:18 crc kubenswrapper[5033]: I0319 20:23:18.933763 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-hr4xj_2f90a5c2-7618-4090-b63c-6d40664ab26e/prometheus-operator/0.log" Mar 19 20:23:19 crc kubenswrapper[5033]: I0319 20:23:19.127136 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-577fb74b9f-w9sjj_88b04cc7-0103-4c0a-bd35-421e81888064/prometheus-operator-admission-webhook/0.log" Mar 19 20:23:19 crc kubenswrapper[5033]: I0319 20:23:19.492897 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-558f99686-58c2z_5f1646e5-4c53-4c2f-93f0-b32741aa44ae/perses-operator/0.log" Mar 19 20:23:19 crc kubenswrapper[5033]: I0319 20:23:19.516047 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-v94xx_92efd1e2-4b4b-48b0-a991-1c3cfe62eef3/operator/0.log" Mar 19 20:23:48 crc kubenswrapper[5033]: I0319 20:23:48.450121 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-767b88fbc9-6bv6n_25d27288-ba82-4c74-a864-b5e54e4be246/kube-rbac-proxy/0.log" Mar 19 20:23:48 crc kubenswrapper[5033]: I0319 20:23:48.702366 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-767b88fbc9-6bv6n_25d27288-ba82-4c74-a864-b5e54e4be246/manager/0.log" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.146839 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565864-x7glf"] Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147705 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147717 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147736 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="extract-utilities" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147742 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="extract-utilities" Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147755 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="extract-content" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147761 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="extract-content" Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147787 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="extract-content" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147793 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="extract-content" Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147803 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147809 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: E0319 20:24:00.147820 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="extract-utilities" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.147826 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="extract-utilities" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.148020 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e6863d-c4df-42a0-8c04-6912e8583e5b" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.148037 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="452a9295-a6f3-4d20-8315-882690326d72" containerName="registry-server" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.148715 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.157353 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.157395 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.164718 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.165148 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-x7glf"] Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.262520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcwlv\" (UniqueName: \"kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv\") pod \"auto-csr-approver-29565864-x7glf\" (UID: \"c27f4e2c-782c-4b13-8c36-d637b5f9f55c\") " pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.364677 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcwlv\" (UniqueName: \"kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv\") pod \"auto-csr-approver-29565864-x7glf\" (UID: \"c27f4e2c-782c-4b13-8c36-d637b5f9f55c\") " pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.587255 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcwlv\" (UniqueName: \"kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv\") pod \"auto-csr-approver-29565864-x7glf\" (UID: \"c27f4e2c-782c-4b13-8c36-d637b5f9f55c\") " 
pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:00 crc kubenswrapper[5033]: I0319 20:24:00.769292 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:01 crc kubenswrapper[5033]: I0319 20:24:01.620960 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-x7glf"] Mar 19 20:24:01 crc kubenswrapper[5033]: I0319 20:24:01.638968 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:24:01 crc kubenswrapper[5033]: I0319 20:24:01.772441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-x7glf" event={"ID":"c27f4e2c-782c-4b13-8c36-d637b5f9f55c","Type":"ContainerStarted","Data":"2abdd49ff0fe7523df333a4439da7e92d9e789c2a1f8efc9ce183c4d050c50af"} Mar 19 20:24:03 crc kubenswrapper[5033]: I0319 20:24:03.797504 5033 generic.go:334] "Generic (PLEG): container finished" podID="c27f4e2c-782c-4b13-8c36-d637b5f9f55c" containerID="90804c370e7cf0012f8cd253d460268e8cbb78afd0e7e3a1b65df583ecc71a04" exitCode=0 Mar 19 20:24:03 crc kubenswrapper[5033]: I0319 20:24:03.797588 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-x7glf" event={"ID":"c27f4e2c-782c-4b13-8c36-d637b5f9f55c","Type":"ContainerDied","Data":"90804c370e7cf0012f8cd253d460268e8cbb78afd0e7e3a1b65df583ecc71a04"} Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.090956 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.201384 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcwlv\" (UniqueName: \"kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv\") pod \"c27f4e2c-782c-4b13-8c36-d637b5f9f55c\" (UID: \"c27f4e2c-782c-4b13-8c36-d637b5f9f55c\") " Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.230646 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv" (OuterVolumeSpecName: "kube-api-access-kcwlv") pod "c27f4e2c-782c-4b13-8c36-d637b5f9f55c" (UID: "c27f4e2c-782c-4b13-8c36-d637b5f9f55c"). InnerVolumeSpecName "kube-api-access-kcwlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.305043 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcwlv\" (UniqueName: \"kubernetes.io/projected/c27f4e2c-782c-4b13-8c36-d637b5f9f55c-kube-api-access-kcwlv\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.838550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-x7glf" event={"ID":"c27f4e2c-782c-4b13-8c36-d637b5f9f55c","Type":"ContainerDied","Data":"2abdd49ff0fe7523df333a4439da7e92d9e789c2a1f8efc9ce183c4d050c50af"} Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.838586 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2abdd49ff0fe7523df333a4439da7e92d9e789c2a1f8efc9ce183c4d050c50af" Mar 19 20:24:06 crc kubenswrapper[5033]: I0319 20:24:06.838634 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-x7glf" Mar 19 20:24:07 crc kubenswrapper[5033]: I0319 20:24:07.229871 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-9cbbx"] Mar 19 20:24:07 crc kubenswrapper[5033]: I0319 20:24:07.252614 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-9cbbx"] Mar 19 20:24:08 crc kubenswrapper[5033]: I0319 20:24:08.648853 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58931cc-765c-4d48-8a5f-239fee20e2c0" path="/var/lib/kubelet/pods/b58931cc-765c-4d48-8a5f-239fee20e2c0/volumes" Mar 19 20:24:40 crc kubenswrapper[5033]: I0319 20:24:40.758789 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:24:40 crc kubenswrapper[5033]: I0319 20:24:40.759323 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:24:42 crc kubenswrapper[5033]: I0319 20:24:42.631797 5033 scope.go:117] "RemoveContainer" containerID="9219efb981eab54a08733e694f05300ee547043fd48807932253175a8fe81cef" Mar 19 20:25:10 crc kubenswrapper[5033]: I0319 20:25:10.758516 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:25:10 crc kubenswrapper[5033]: 
I0319 20:25:10.759123 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:25:40 crc kubenswrapper[5033]: I0319 20:25:40.758590 5033 patch_prober.go:28] interesting pod/machine-config-daemon-779xw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:25:40 crc kubenswrapper[5033]: I0319 20:25:40.759120 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:25:40 crc kubenswrapper[5033]: I0319 20:25:40.759161 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-779xw" Mar 19 20:25:40 crc kubenswrapper[5033]: I0319 20:25:40.759917 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646"} pod="openshift-machine-config-operator/machine-config-daemon-779xw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:25:40 crc kubenswrapper[5033]: I0319 20:25:40.759966 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" 
containerName="machine-config-daemon" containerID="cri-o://726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" gracePeriod=600 Mar 19 20:25:40 crc kubenswrapper[5033]: E0319 20:25:40.882281 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:25:41 crc kubenswrapper[5033]: I0319 20:25:41.718853 5033 generic.go:334] "Generic (PLEG): container finished" podID="c960a9d1-3c99-4e77-9906-e319e0aed817" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" exitCode=0 Mar 19 20:25:41 crc kubenswrapper[5033]: I0319 20:25:41.719398 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerDied","Data":"726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646"} Mar 19 20:25:41 crc kubenswrapper[5033]: I0319 20:25:41.719505 5033 scope.go:117] "RemoveContainer" containerID="f99ad33d5fef7ef97cb71b16f0f7d2ee1af4ff25f11d5566e4c59a37afdfca20" Mar 19 20:25:41 crc kubenswrapper[5033]: I0319 20:25:41.720215 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:25:41 crc kubenswrapper[5033]: E0319 20:25:41.720539 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:25:56 crc kubenswrapper[5033]: I0319 20:25:56.621141 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:25:56 crc kubenswrapper[5033]: E0319 20:25:56.621830 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.146722 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565866-zjdlp"] Mar 19 20:26:00 crc kubenswrapper[5033]: E0319 20:26:00.157986 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27f4e2c-782c-4b13-8c36-d637b5f9f55c" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.158004 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27f4e2c-782c-4b13-8c36-d637b5f9f55c" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.158237 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27f4e2c-782c-4b13-8c36-d637b5f9f55c" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.158866 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-zjdlp"] Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.158934 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.164834 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.165215 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.165352 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.230862 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rxn\" (UniqueName: \"kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn\") pod \"auto-csr-approver-29565866-zjdlp\" (UID: \"9bec14ec-4cd8-4547-a5d5-b75084370710\") " pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.332900 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8rxn\" (UniqueName: \"kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn\") pod \"auto-csr-approver-29565866-zjdlp\" (UID: \"9bec14ec-4cd8-4547-a5d5-b75084370710\") " pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.354700 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8rxn\" (UniqueName: \"kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn\") pod \"auto-csr-approver-29565866-zjdlp\" (UID: \"9bec14ec-4cd8-4547-a5d5-b75084370710\") " pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:00 crc kubenswrapper[5033]: I0319 20:26:00.499908 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:01 crc kubenswrapper[5033]: I0319 20:26:01.322913 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-zjdlp"] Mar 19 20:26:01 crc kubenswrapper[5033]: I0319 20:26:01.940432 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" event={"ID":"9bec14ec-4cd8-4547-a5d5-b75084370710","Type":"ContainerStarted","Data":"483bf24798397847c54975b63e3cdc6f921c419921c65617eac72290bc83b336"} Mar 19 20:26:02 crc kubenswrapper[5033]: I0319 20:26:02.951109 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" event={"ID":"9bec14ec-4cd8-4547-a5d5-b75084370710","Type":"ContainerStarted","Data":"caa972ca99d11685d9d26d57d8761a39f0527f6e8c82364f449f62b6027aba88"} Mar 19 20:26:03 crc kubenswrapper[5033]: I0319 20:26:03.021303 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" podStartSLOduration=2.139176882 podStartE2EDuration="3.021280966s" podCreationTimestamp="2026-03-19 20:26:00 +0000 UTC" firstStartedPulling="2026-03-19 20:26:01.318624987 +0000 UTC m=+5371.423654846" lastFinishedPulling="2026-03-19 20:26:02.200729081 +0000 UTC m=+5372.305758930" observedRunningTime="2026-03-19 20:26:02.963182375 +0000 UTC m=+5373.068212224" watchObservedRunningTime="2026-03-19 20:26:03.021280966 +0000 UTC m=+5373.126310815" Mar 19 20:26:03 crc kubenswrapper[5033]: I0319 20:26:03.961554 5033 generic.go:334] "Generic (PLEG): container finished" podID="9bec14ec-4cd8-4547-a5d5-b75084370710" containerID="caa972ca99d11685d9d26d57d8761a39f0527f6e8c82364f449f62b6027aba88" exitCode=0 Mar 19 20:26:03 crc kubenswrapper[5033]: I0319 20:26:03.961598 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" 
event={"ID":"9bec14ec-4cd8-4547-a5d5-b75084370710","Type":"ContainerDied","Data":"caa972ca99d11685d9d26d57d8761a39f0527f6e8c82364f449f62b6027aba88"} Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.184103 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.365197 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8rxn\" (UniqueName: \"kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn\") pod \"9bec14ec-4cd8-4547-a5d5-b75084370710\" (UID: \"9bec14ec-4cd8-4547-a5d5-b75084370710\") " Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.370712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn" (OuterVolumeSpecName: "kube-api-access-t8rxn") pod "9bec14ec-4cd8-4547-a5d5-b75084370710" (UID: "9bec14ec-4cd8-4547-a5d5-b75084370710"). InnerVolumeSpecName "kube-api-access-t8rxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.467859 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8rxn\" (UniqueName: \"kubernetes.io/projected/9bec14ec-4cd8-4547-a5d5-b75084370710-kube-api-access-t8rxn\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.999087 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" event={"ID":"9bec14ec-4cd8-4547-a5d5-b75084370710","Type":"ContainerDied","Data":"483bf24798397847c54975b63e3cdc6f921c419921c65617eac72290bc83b336"} Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.999362 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="483bf24798397847c54975b63e3cdc6f921c419921c65617eac72290bc83b336" Mar 19 20:26:06 crc kubenswrapper[5033]: I0319 20:26:06.999157 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-zjdlp" Mar 19 20:26:07 crc kubenswrapper[5033]: I0319 20:26:07.247840 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-tz5nj"] Mar 19 20:26:07 crc kubenswrapper[5033]: I0319 20:26:07.257673 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-tz5nj"] Mar 19 20:26:08 crc kubenswrapper[5033]: I0319 20:26:08.632225 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a" path="/var/lib/kubelet/pods/d0c16eb5-ee30-4a39-b8d4-f399a5c4bb9a/volumes" Mar 19 20:26:09 crc kubenswrapper[5033]: I0319 20:26:09.620932 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:26:09 crc kubenswrapper[5033]: E0319 20:26:09.621200 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:26:24 crc kubenswrapper[5033]: I0319 20:26:24.620819 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:26:24 crc kubenswrapper[5033]: E0319 20:26:24.621493 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:26:27 crc kubenswrapper[5033]: I0319 20:26:27.184668 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerID="7f1e0073f4bff60859d00487fffe3863027d69eed8a28138234934c2a894bc1d" exitCode=0 Mar 19 20:26:27 crc kubenswrapper[5033]: I0319 20:26:27.184749 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" event={"ID":"9e95d984-9a67-460b-a101-cfea0b90a4d5","Type":"ContainerDied","Data":"7f1e0073f4bff60859d00487fffe3863027d69eed8a28138234934c2a894bc1d"} Mar 19 20:26:27 crc kubenswrapper[5033]: I0319 20:26:27.185600 5033 scope.go:117] "RemoveContainer" containerID="7f1e0073f4bff60859d00487fffe3863027d69eed8a28138234934c2a894bc1d" Mar 19 20:26:27 crc kubenswrapper[5033]: I0319 20:26:27.279395 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-sdzkv_must-gather-5p5h2_9e95d984-9a67-460b-a101-cfea0b90a4d5/gather/0.log" Mar 19 20:26:35 crc kubenswrapper[5033]: I0319 20:26:35.941478 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sdzkv/must-gather-5p5h2"] Mar 19 20:26:35 crc kubenswrapper[5033]: I0319 20:26:35.943575 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="copy" containerID="cri-o://e225c7d2fa521bf56f8511b7bd7e707aab7cb341784689fe1f9e0dc4fcb40e31" gracePeriod=2 Mar 19 20:26:35 crc kubenswrapper[5033]: I0319 20:26:35.951576 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sdzkv/must-gather-5p5h2"] Mar 19 20:26:36 crc kubenswrapper[5033]: I0319 20:26:36.315415 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sdzkv_must-gather-5p5h2_9e95d984-9a67-460b-a101-cfea0b90a4d5/copy/0.log" Mar 19 20:26:36 crc kubenswrapper[5033]: I0319 20:26:36.316281 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerID="e225c7d2fa521bf56f8511b7bd7e707aab7cb341784689fe1f9e0dc4fcb40e31" exitCode=143 Mar 19 20:26:36 crc kubenswrapper[5033]: I0319 20:26:36.620396 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:26:36 crc kubenswrapper[5033]: E0319 20:26:36.621000 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:26:37 crc 
kubenswrapper[5033]: I0319 20:26:37.040660 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sdzkv_must-gather-5p5h2_9e95d984-9a67-460b-a101-cfea0b90a4d5/copy/0.log" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.041113 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.185546 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output\") pod \"9e95d984-9a67-460b-a101-cfea0b90a4d5\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.185716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g67w\" (UniqueName: \"kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w\") pod \"9e95d984-9a67-460b-a101-cfea0b90a4d5\" (UID: \"9e95d984-9a67-460b-a101-cfea0b90a4d5\") " Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.191615 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w" (OuterVolumeSpecName: "kube-api-access-8g67w") pod "9e95d984-9a67-460b-a101-cfea0b90a4d5" (UID: "9e95d984-9a67-460b-a101-cfea0b90a4d5"). InnerVolumeSpecName "kube-api-access-8g67w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.288543 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g67w\" (UniqueName: \"kubernetes.io/projected/9e95d984-9a67-460b-a101-cfea0b90a4d5-kube-api-access-8g67w\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.330829 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sdzkv_must-gather-5p5h2_9e95d984-9a67-460b-a101-cfea0b90a4d5/copy/0.log" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.331961 5033 scope.go:117] "RemoveContainer" containerID="e225c7d2fa521bf56f8511b7bd7e707aab7cb341784689fe1f9e0dc4fcb40e31" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.332133 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sdzkv/must-gather-5p5h2" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.360213 5033 scope.go:117] "RemoveContainer" containerID="7f1e0073f4bff60859d00487fffe3863027d69eed8a28138234934c2a894bc1d" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.379072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9e95d984-9a67-460b-a101-cfea0b90a4d5" (UID: "9e95d984-9a67-460b-a101-cfea0b90a4d5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:37 crc kubenswrapper[5033]: I0319 20:26:37.390861 5033 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e95d984-9a67-460b-a101-cfea0b90a4d5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:38 crc kubenswrapper[5033]: I0319 20:26:38.631968 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" path="/var/lib/kubelet/pods/9e95d984-9a67-460b-a101-cfea0b90a4d5/volumes" Mar 19 20:26:42 crc kubenswrapper[5033]: I0319 20:26:42.735483 5033 scope.go:117] "RemoveContainer" containerID="d8e0976a0bd2b50e40009777579e40b5c01e00acf3e6ae39fb752835c5aedb46" Mar 19 20:26:48 crc kubenswrapper[5033]: I0319 20:26:48.620522 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:26:48 crc kubenswrapper[5033]: E0319 20:26:48.621222 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.953187 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:26:50 crc kubenswrapper[5033]: E0319 20:26:50.954155 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bec14ec-4cd8-4547-a5d5-b75084370710" containerName="oc" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954172 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bec14ec-4cd8-4547-a5d5-b75084370710" containerName="oc" Mar 19 20:26:50 crc 
kubenswrapper[5033]: E0319 20:26:50.954191 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="gather" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="gather" Mar 19 20:26:50 crc kubenswrapper[5033]: E0319 20:26:50.954219 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="copy" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954227 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="copy" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954444 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="gather" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954489 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e95d984-9a67-460b-a101-cfea0b90a4d5" containerName="copy" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.954505 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bec14ec-4cd8-4547-a5d5-b75084370710" containerName="oc" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.955983 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:50 crc kubenswrapper[5033]: I0319 20:26:50.974387 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.063234 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.063289 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f82cd\" (UniqueName: \"kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.063397 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.164806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.164980 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.165013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f82cd\" (UniqueName: \"kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.166166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.166211 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.192135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f82cd\" (UniqueName: \"kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd\") pod \"certified-operators-svzdh\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:51 crc kubenswrapper[5033]: I0319 20:26:51.286507 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:26:52 crc kubenswrapper[5033]: I0319 20:26:52.054739 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:26:52 crc kubenswrapper[5033]: I0319 20:26:52.467675 5033 generic.go:334] "Generic (PLEG): container finished" podID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerID="55a28ec2ffe3a3cd2d22e63b23e3199ca2af498191e4bd47105b71e4212ce30d" exitCode=0 Mar 19 20:26:52 crc kubenswrapper[5033]: I0319 20:26:52.467760 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerDied","Data":"55a28ec2ffe3a3cd2d22e63b23e3199ca2af498191e4bd47105b71e4212ce30d"} Mar 19 20:26:52 crc kubenswrapper[5033]: I0319 20:26:52.467980 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerStarted","Data":"a17056be1a2d9b69d492513e085e89d424192b293a77e677f232bdce17543332"} Mar 19 20:26:53 crc kubenswrapper[5033]: I0319 20:26:53.478180 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerStarted","Data":"74d31698c01fe5a6662bb3ed2fae09c1fb4ee2b800097cba879e2873048fa9a8"} Mar 19 20:26:55 crc kubenswrapper[5033]: I0319 20:26:55.505970 5033 generic.go:334] "Generic (PLEG): container finished" podID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerID="74d31698c01fe5a6662bb3ed2fae09c1fb4ee2b800097cba879e2873048fa9a8" exitCode=0 Mar 19 20:26:55 crc kubenswrapper[5033]: I0319 20:26:55.506033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" 
event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerDied","Data":"74d31698c01fe5a6662bb3ed2fae09c1fb4ee2b800097cba879e2873048fa9a8"} Mar 19 20:26:56 crc kubenswrapper[5033]: I0319 20:26:56.517894 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerStarted","Data":"97d577b014a41e19b6f5bf21ba673d467ed7c15dc7c9438002ea5d48b5557f53"} Mar 19 20:26:56 crc kubenswrapper[5033]: I0319 20:26:56.543020 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-svzdh" podStartSLOduration=2.978130933 podStartE2EDuration="6.542998971s" podCreationTimestamp="2026-03-19 20:26:50 +0000 UTC" firstStartedPulling="2026-03-19 20:26:52.469344734 +0000 UTC m=+5422.574374583" lastFinishedPulling="2026-03-19 20:26:56.034212772 +0000 UTC m=+5426.139242621" observedRunningTime="2026-03-19 20:26:56.537019233 +0000 UTC m=+5426.642049102" watchObservedRunningTime="2026-03-19 20:26:56.542998971 +0000 UTC m=+5426.648028820" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.287180 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.288515 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.343378 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.595872 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.600425 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.617027 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.624595 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:27:01 crc kubenswrapper[5033]: E0319 20:27:01.631319 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.657979 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.749660 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.749902 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5gx\" (UniqueName: \"kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 
20:27:01.750188 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.851736 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.851890 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.851933 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5gx\" (UniqueName: \"kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.852670 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 
20:27:01.852886 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.882026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5gx\" (UniqueName: \"kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx\") pod \"redhat-marketplace-88sxb\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:01 crc kubenswrapper[5033]: I0319 20:27:01.933251 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:02 crc kubenswrapper[5033]: I0319 20:27:02.883042 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:03 crc kubenswrapper[5033]: I0319 20:27:03.590228 5033 generic.go:334] "Generic (PLEG): container finished" podID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerID="c4153dfb4df03d905738b6e5e50a27d3f0e247dd0845dcc5a64574231a30fdf1" exitCode=0 Mar 19 20:27:03 crc kubenswrapper[5033]: I0319 20:27:03.590269 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerDied","Data":"c4153dfb4df03d905738b6e5e50a27d3f0e247dd0845dcc5a64574231a30fdf1"} Mar 19 20:27:03 crc kubenswrapper[5033]: I0319 20:27:03.590508 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerStarted","Data":"fa1ca07b9f1f0659551f32a96b65ba3f0c09fec0267b0ba322240e825870c265"} Mar 19 
20:27:03 crc kubenswrapper[5033]: I0319 20:27:03.982141 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:27:03 crc kubenswrapper[5033]: I0319 20:27:03.982384 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-svzdh" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="registry-server" containerID="cri-o://97d577b014a41e19b6f5bf21ba673d467ed7c15dc7c9438002ea5d48b5557f53" gracePeriod=2 Mar 19 20:27:04 crc kubenswrapper[5033]: I0319 20:27:04.600668 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerStarted","Data":"38bf46fe927d31ce86e6b8d0075d6b0839baf0813a054098f3c83896818c3bc4"} Mar 19 20:27:04 crc kubenswrapper[5033]: I0319 20:27:04.602993 5033 generic.go:334] "Generic (PLEG): container finished" podID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerID="97d577b014a41e19b6f5bf21ba673d467ed7c15dc7c9438002ea5d48b5557f53" exitCode=0 Mar 19 20:27:04 crc kubenswrapper[5033]: I0319 20:27:04.603028 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerDied","Data":"97d577b014a41e19b6f5bf21ba673d467ed7c15dc7c9438002ea5d48b5557f53"} Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.414437 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.535144 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content\") pod \"528ec436-9121-42d1-8f87-ac8fdab6af1f\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.535642 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f82cd\" (UniqueName: \"kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd\") pod \"528ec436-9121-42d1-8f87-ac8fdab6af1f\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.535757 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities\") pod \"528ec436-9121-42d1-8f87-ac8fdab6af1f\" (UID: \"528ec436-9121-42d1-8f87-ac8fdab6af1f\") " Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.540115 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities" (OuterVolumeSpecName: "utilities") pod "528ec436-9121-42d1-8f87-ac8fdab6af1f" (UID: "528ec436-9121-42d1-8f87-ac8fdab6af1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.566651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd" (OuterVolumeSpecName: "kube-api-access-f82cd") pod "528ec436-9121-42d1-8f87-ac8fdab6af1f" (UID: "528ec436-9121-42d1-8f87-ac8fdab6af1f"). InnerVolumeSpecName "kube-api-access-f82cd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.635568 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svzdh" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.636172 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svzdh" event={"ID":"528ec436-9121-42d1-8f87-ac8fdab6af1f","Type":"ContainerDied","Data":"a17056be1a2d9b69d492513e085e89d424192b293a77e677f232bdce17543332"} Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.636220 5033 scope.go:117] "RemoveContainer" containerID="97d577b014a41e19b6f5bf21ba673d467ed7c15dc7c9438002ea5d48b5557f53" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.638181 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f82cd\" (UniqueName: \"kubernetes.io/projected/528ec436-9121-42d1-8f87-ac8fdab6af1f-kube-api-access-f82cd\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.638210 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.668871 5033 scope.go:117] "RemoveContainer" containerID="74d31698c01fe5a6662bb3ed2fae09c1fb4ee2b800097cba879e2873048fa9a8" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.677056 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "528ec436-9121-42d1-8f87-ac8fdab6af1f" (UID: "528ec436-9121-42d1-8f87-ac8fdab6af1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.706356 5033 scope.go:117] "RemoveContainer" containerID="55a28ec2ffe3a3cd2d22e63b23e3199ca2af498191e4bd47105b71e4212ce30d" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.740144 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528ec436-9121-42d1-8f87-ac8fdab6af1f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.967030 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:27:05 crc kubenswrapper[5033]: I0319 20:27:05.975621 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-svzdh"] Mar 19 20:27:06 crc kubenswrapper[5033]: I0319 20:27:06.633835 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" path="/var/lib/kubelet/pods/528ec436-9121-42d1-8f87-ac8fdab6af1f/volumes" Mar 19 20:27:06 crc kubenswrapper[5033]: I0319 20:27:06.646507 5033 generic.go:334] "Generic (PLEG): container finished" podID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerID="38bf46fe927d31ce86e6b8d0075d6b0839baf0813a054098f3c83896818c3bc4" exitCode=0 Mar 19 20:27:06 crc kubenswrapper[5033]: I0319 20:27:06.646573 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerDied","Data":"38bf46fe927d31ce86e6b8d0075d6b0839baf0813a054098f3c83896818c3bc4"} Mar 19 20:27:07 crc kubenswrapper[5033]: I0319 20:27:07.664133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerStarted","Data":"4c454033c23d7c5338c0eb5be8c1cd8f02f2e6ddcc210836a7a31c45a3667fdc"} 
Mar 19 20:27:07 crc kubenswrapper[5033]: I0319 20:27:07.699385 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88sxb" podStartSLOduration=3.194083259 podStartE2EDuration="6.699362423s" podCreationTimestamp="2026-03-19 20:27:01 +0000 UTC" firstStartedPulling="2026-03-19 20:27:03.592223646 +0000 UTC m=+5433.697253495" lastFinishedPulling="2026-03-19 20:27:07.09750281 +0000 UTC m=+5437.202532659" observedRunningTime="2026-03-19 20:27:07.683990172 +0000 UTC m=+5437.789020011" watchObservedRunningTime="2026-03-19 20:27:07.699362423 +0000 UTC m=+5437.804392272" Mar 19 20:27:11 crc kubenswrapper[5033]: I0319 20:27:11.934387 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:11 crc kubenswrapper[5033]: I0319 20:27:11.934910 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:11 crc kubenswrapper[5033]: I0319 20:27:11.991426 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:12 crc kubenswrapper[5033]: I0319 20:27:12.803974 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:12 crc kubenswrapper[5033]: I0319 20:27:12.854357 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:14 crc kubenswrapper[5033]: I0319 20:27:14.620341 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:27:14 crc kubenswrapper[5033]: E0319 20:27:14.620868 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:27:14 crc kubenswrapper[5033]: I0319 20:27:14.772382 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88sxb" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="registry-server" containerID="cri-o://4c454033c23d7c5338c0eb5be8c1cd8f02f2e6ddcc210836a7a31c45a3667fdc" gracePeriod=2 Mar 19 20:27:15 crc kubenswrapper[5033]: I0319 20:27:15.784836 5033 generic.go:334] "Generic (PLEG): container finished" podID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerID="4c454033c23d7c5338c0eb5be8c1cd8f02f2e6ddcc210836a7a31c45a3667fdc" exitCode=0 Mar 19 20:27:15 crc kubenswrapper[5033]: I0319 20:27:15.784918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerDied","Data":"4c454033c23d7c5338c0eb5be8c1cd8f02f2e6ddcc210836a7a31c45a3667fdc"} Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.003921 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.042332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx5gx\" (UniqueName: \"kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx\") pod \"8edfb269-9105-4aee-bdf3-2a4e223824c9\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.042445 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities\") pod \"8edfb269-9105-4aee-bdf3-2a4e223824c9\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.042545 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content\") pod \"8edfb269-9105-4aee-bdf3-2a4e223824c9\" (UID: \"8edfb269-9105-4aee-bdf3-2a4e223824c9\") " Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.052399 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities" (OuterVolumeSpecName: "utilities") pod "8edfb269-9105-4aee-bdf3-2a4e223824c9" (UID: "8edfb269-9105-4aee-bdf3-2a4e223824c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.052443 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx" (OuterVolumeSpecName: "kube-api-access-tx5gx") pod "8edfb269-9105-4aee-bdf3-2a4e223824c9" (UID: "8edfb269-9105-4aee-bdf3-2a4e223824c9"). InnerVolumeSpecName "kube-api-access-tx5gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.071743 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8edfb269-9105-4aee-bdf3-2a4e223824c9" (UID: "8edfb269-9105-4aee-bdf3-2a4e223824c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.145791 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.145825 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8edfb269-9105-4aee-bdf3-2a4e223824c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.145840 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx5gx\" (UniqueName: \"kubernetes.io/projected/8edfb269-9105-4aee-bdf3-2a4e223824c9-kube-api-access-tx5gx\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.794951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88sxb" event={"ID":"8edfb269-9105-4aee-bdf3-2a4e223824c9","Type":"ContainerDied","Data":"fa1ca07b9f1f0659551f32a96b65ba3f0c09fec0267b0ba322240e825870c265"} Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.794995 5033 scope.go:117] "RemoveContainer" containerID="4c454033c23d7c5338c0eb5be8c1cd8f02f2e6ddcc210836a7a31c45a3667fdc" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.795114 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88sxb" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.818097 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.821076 5033 scope.go:117] "RemoveContainer" containerID="38bf46fe927d31ce86e6b8d0075d6b0839baf0813a054098f3c83896818c3bc4" Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.835815 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88sxb"] Mar 19 20:27:16 crc kubenswrapper[5033]: I0319 20:27:16.854873 5033 scope.go:117] "RemoveContainer" containerID="c4153dfb4df03d905738b6e5e50a27d3f0e247dd0845dcc5a64574231a30fdf1" Mar 19 20:27:18 crc kubenswrapper[5033]: I0319 20:27:18.631683 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" path="/var/lib/kubelet/pods/8edfb269-9105-4aee-bdf3-2a4e223824c9/volumes" Mar 19 20:27:29 crc kubenswrapper[5033]: I0319 20:27:29.621544 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:27:29 crc kubenswrapper[5033]: E0319 20:27:29.622228 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:27:43 crc kubenswrapper[5033]: I0319 20:27:43.620386 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:27:43 crc kubenswrapper[5033]: E0319 20:27:43.621267 5033 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:27:55 crc kubenswrapper[5033]: I0319 20:27:55.620722 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:27:55 crc kubenswrapper[5033]: E0319 20:27:55.621406 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.147561 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565868-pbvvx"] Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148492 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148504 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148519 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148526 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" 
containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148547 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148553 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148569 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148575 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148586 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148591 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[5033]: E0319 20:28:00.148601 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148607 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148796 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="528ec436-9121-42d1-8f87-ac8fdab6af1f" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.148825 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8edfb269-9105-4aee-bdf3-2a4e223824c9" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.149634 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.151515 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.151992 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.152550 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.158689 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-pbvvx"] Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.304480 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5k5\" (UniqueName: \"kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5\") pod \"auto-csr-approver-29565868-pbvvx\" (UID: \"822c7653-7791-42b9-823c-cc2ce1e27863\") " pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.406497 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5k5\" (UniqueName: \"kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5\") pod \"auto-csr-approver-29565868-pbvvx\" (UID: \"822c7653-7791-42b9-823c-cc2ce1e27863\") " pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.427236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5k5\" 
(UniqueName: \"kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5\") pod \"auto-csr-approver-29565868-pbvvx\" (UID: \"822c7653-7791-42b9-823c-cc2ce1e27863\") " pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:00 crc kubenswrapper[5033]: I0319 20:28:00.469803 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:01 crc kubenswrapper[5033]: I0319 20:28:01.225034 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-pbvvx"] Mar 19 20:28:02 crc kubenswrapper[5033]: I0319 20:28:02.188278 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" event={"ID":"822c7653-7791-42b9-823c-cc2ce1e27863","Type":"ContainerStarted","Data":"1f5a50e09895d008721c3a36593acfd4c28a68858d63e83185957bac8f17199f"} Mar 19 20:28:03 crc kubenswrapper[5033]: I0319 20:28:03.197434 5033 generic.go:334] "Generic (PLEG): container finished" podID="822c7653-7791-42b9-823c-cc2ce1e27863" containerID="0899d6191281ccf84df4363dff2cc343fb4a9420bd829dc1004987119482c7f2" exitCode=0 Mar 19 20:28:03 crc kubenswrapper[5033]: I0319 20:28:03.197510 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" event={"ID":"822c7653-7791-42b9-823c-cc2ce1e27863","Type":"ContainerDied","Data":"0899d6191281ccf84df4363dff2cc343fb4a9420bd829dc1004987119482c7f2"} Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.217006 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" event={"ID":"822c7653-7791-42b9-823c-cc2ce1e27863","Type":"ContainerDied","Data":"1f5a50e09895d008721c3a36593acfd4c28a68858d63e83185957bac8f17199f"} Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.217449 5033 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1f5a50e09895d008721c3a36593acfd4c28a68858d63e83185957bac8f17199f" Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.272774 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.343855 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr5k5\" (UniqueName: \"kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5\") pod \"822c7653-7791-42b9-823c-cc2ce1e27863\" (UID: \"822c7653-7791-42b9-823c-cc2ce1e27863\") " Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.355678 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5" (OuterVolumeSpecName: "kube-api-access-qr5k5") pod "822c7653-7791-42b9-823c-cc2ce1e27863" (UID: "822c7653-7791-42b9-823c-cc2ce1e27863"). InnerVolumeSpecName "kube-api-access-qr5k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:28:05 crc kubenswrapper[5033]: I0319 20:28:05.446732 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr5k5\" (UniqueName: \"kubernetes.io/projected/822c7653-7791-42b9-823c-cc2ce1e27863-kube-api-access-qr5k5\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:06 crc kubenswrapper[5033]: I0319 20:28:06.225059 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-pbvvx" Mar 19 20:28:06 crc kubenswrapper[5033]: I0319 20:28:06.354546 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-jd6m9"] Mar 19 20:28:06 crc kubenswrapper[5033]: I0319 20:28:06.364208 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-jd6m9"] Mar 19 20:28:06 crc kubenswrapper[5033]: I0319 20:28:06.632550 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5781f381-0321-47f5-a74c-bd0331daffd9" path="/var/lib/kubelet/pods/5781f381-0321-47f5-a74c-bd0331daffd9/volumes" Mar 19 20:28:10 crc kubenswrapper[5033]: I0319 20:28:10.627984 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:28:10 crc kubenswrapper[5033]: E0319 20:28:10.628684 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:28:23 crc kubenswrapper[5033]: I0319 20:28:23.620057 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:28:23 crc kubenswrapper[5033]: E0319 20:28:23.620754 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:28:38 crc kubenswrapper[5033]: I0319 20:28:38.621088 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:28:38 crc kubenswrapper[5033]: E0319 20:28:38.621876 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:28:42 crc kubenswrapper[5033]: I0319 20:28:42.875758 5033 scope.go:117] "RemoveContainer" containerID="7e61c18abce6cc0dac1834bd702fda35fa76b6e2a06f3515d98c4bffb602e93f" Mar 19 20:28:50 crc kubenswrapper[5033]: I0319 20:28:50.628256 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:28:50 crc kubenswrapper[5033]: E0319 20:28:50.629508 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:29:01 crc kubenswrapper[5033]: I0319 20:29:01.622365 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:29:01 crc kubenswrapper[5033]: E0319 20:29:01.623079 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:29:13 crc kubenswrapper[5033]: I0319 20:29:13.620833 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:29:13 crc kubenswrapper[5033]: E0319 20:29:13.621602 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:29:26 crc kubenswrapper[5033]: I0319 20:29:26.621117 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:29:26 crc kubenswrapper[5033]: E0319 20:29:26.621933 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:29:39 crc kubenswrapper[5033]: I0319 20:29:39.620996 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:29:39 crc kubenswrapper[5033]: E0319 20:29:39.621825 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:29:50 crc kubenswrapper[5033]: I0319 20:29:50.626887 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:29:50 crc kubenswrapper[5033]: E0319 20:29:50.627659 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.148745 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565870-v9bgp"] Mar 19 20:30:00 crc kubenswrapper[5033]: E0319 20:30:00.149886 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822c7653-7791-42b9-823c-cc2ce1e27863" containerName="oc" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.149907 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="822c7653-7791-42b9-823c-cc2ce1e27863" containerName="oc" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.150176 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="822c7653-7791-42b9-823c-cc2ce1e27863" containerName="oc" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.151147 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.154224 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.154699 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-56vk8" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.154911 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.157899 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz"] Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.159356 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.161120 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.162105 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.169670 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-v9bgp"] Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.191987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz"] Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.227744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ntb\" (UniqueName: 
\"kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.227841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.227884 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.227948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72d7h\" (UniqueName: \"kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h\") pod \"auto-csr-approver-29565870-v9bgp\" (UID: \"c114afb4-7cf2-4bb2-97be-c3234b813230\") " pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.329689 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72d7h\" (UniqueName: \"kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h\") pod \"auto-csr-approver-29565870-v9bgp\" (UID: \"c114afb4-7cf2-4bb2-97be-c3234b813230\") " pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:00 
crc kubenswrapper[5033]: I0319 20:30:00.329788 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65ntb\" (UniqueName: \"kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.329846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.329882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.331221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.336907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume\") pod \"collect-profiles-29565870-m7snz\" (UID: 
\"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.351961 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72d7h\" (UniqueName: \"kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h\") pod \"auto-csr-approver-29565870-v9bgp\" (UID: \"c114afb4-7cf2-4bb2-97be-c3234b813230\") " pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.361216 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ntb\" (UniqueName: \"kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb\") pod \"collect-profiles-29565870-m7snz\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.470890 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:00 crc kubenswrapper[5033]: I0319 20:30:00.482306 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:01 crc kubenswrapper[5033]: I0319 20:30:01.289779 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz"] Mar 19 20:30:01 crc kubenswrapper[5033]: I0319 20:30:01.340814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" event={"ID":"94f9fd0e-779f-4179-aa97-2598a9ef72f1","Type":"ContainerStarted","Data":"eb6cd65611e5f927bfe564b7bb439adb95cfc2b170c975af17c25595e86808b3"} Mar 19 20:30:01 crc kubenswrapper[5033]: I0319 20:30:01.602608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-v9bgp"] Mar 19 20:30:01 crc kubenswrapper[5033]: I0319 20:30:01.602715 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:30:02 crc kubenswrapper[5033]: I0319 20:30:02.350050 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" event={"ID":"c114afb4-7cf2-4bb2-97be-c3234b813230","Type":"ContainerStarted","Data":"3c9b3ae9b46a2a0d06b7594a0651c3fcfd34bb52be2a7ac5cd265bb09e607781"} Mar 19 20:30:02 crc kubenswrapper[5033]: I0319 20:30:02.351749 5033 generic.go:334] "Generic (PLEG): container finished" podID="94f9fd0e-779f-4179-aa97-2598a9ef72f1" containerID="48f5a8129e56a85b546cf0409fd327d80c821e4d6932049044b5904cb6a6b7b7" exitCode=0 Mar 19 20:30:02 crc kubenswrapper[5033]: I0319 20:30:02.351780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" event={"ID":"94f9fd0e-779f-4179-aa97-2598a9ef72f1","Type":"ContainerDied","Data":"48f5a8129e56a85b546cf0409fd327d80c821e4d6932049044b5904cb6a6b7b7"} Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.385164 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" event={"ID":"94f9fd0e-779f-4179-aa97-2598a9ef72f1","Type":"ContainerDied","Data":"eb6cd65611e5f927bfe564b7bb439adb95cfc2b170c975af17c25595e86808b3"} Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.385699 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb6cd65611e5f927bfe564b7bb439adb95cfc2b170c975af17c25595e86808b3" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.514034 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.620123 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:30:04 crc kubenswrapper[5033]: E0319 20:30:04.620349 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.620838 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65ntb\" (UniqueName: \"kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb\") pod \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.620989 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume\") pod 
\"94f9fd0e-779f-4179-aa97-2598a9ef72f1\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.621021 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume\") pod \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\" (UID: \"94f9fd0e-779f-4179-aa97-2598a9ef72f1\") " Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.622042 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "94f9fd0e-779f-4179-aa97-2598a9ef72f1" (UID: "94f9fd0e-779f-4179-aa97-2598a9ef72f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.628955 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb" (OuterVolumeSpecName: "kube-api-access-65ntb") pod "94f9fd0e-779f-4179-aa97-2598a9ef72f1" (UID: "94f9fd0e-779f-4179-aa97-2598a9ef72f1"). InnerVolumeSpecName "kube-api-access-65ntb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.634626 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94f9fd0e-779f-4179-aa97-2598a9ef72f1" (UID: "94f9fd0e-779f-4179-aa97-2598a9ef72f1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.724085 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65ntb\" (UniqueName: \"kubernetes.io/projected/94f9fd0e-779f-4179-aa97-2598a9ef72f1-kube-api-access-65ntb\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.724116 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94f9fd0e-779f-4179-aa97-2598a9ef72f1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:04 crc kubenswrapper[5033]: I0319 20:30:04.724126 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94f9fd0e-779f-4179-aa97-2598a9ef72f1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:05 crc kubenswrapper[5033]: I0319 20:30:05.396705 5033 generic.go:334] "Generic (PLEG): container finished" podID="c114afb4-7cf2-4bb2-97be-c3234b813230" containerID="4b2e0f743abaca9dac99fcc2887c73fb256073bfc984f3f20f05dc8ac31fec58" exitCode=0 Mar 19 20:30:05 crc kubenswrapper[5033]: I0319 20:30:05.396813 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" event={"ID":"c114afb4-7cf2-4bb2-97be-c3234b813230","Type":"ContainerDied","Data":"4b2e0f743abaca9dac99fcc2887c73fb256073bfc984f3f20f05dc8ac31fec58"} Mar 19 20:30:05 crc kubenswrapper[5033]: I0319 20:30:05.396987 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-m7snz" Mar 19 20:30:05 crc kubenswrapper[5033]: I0319 20:30:05.612709 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc"] Mar 19 20:30:05 crc kubenswrapper[5033]: I0319 20:30:05.626325 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-2p6gc"] Mar 19 20:30:06 crc kubenswrapper[5033]: I0319 20:30:06.630913 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756ad68c-ccdb-41db-80bc-923f130791d6" path="/var/lib/kubelet/pods/756ad68c-ccdb-41db-80bc-923f130791d6/volumes" Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.414281 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" event={"ID":"c114afb4-7cf2-4bb2-97be-c3234b813230","Type":"ContainerDied","Data":"3c9b3ae9b46a2a0d06b7594a0651c3fcfd34bb52be2a7ac5cd265bb09e607781"} Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.414589 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9b3ae9b46a2a0d06b7594a0651c3fcfd34bb52be2a7ac5cd265bb09e607781" Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.435384 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.483589 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72d7h\" (UniqueName: \"kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h\") pod \"c114afb4-7cf2-4bb2-97be-c3234b813230\" (UID: \"c114afb4-7cf2-4bb2-97be-c3234b813230\") " Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.501932 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h" (OuterVolumeSpecName: "kube-api-access-72d7h") pod "c114afb4-7cf2-4bb2-97be-c3234b813230" (UID: "c114afb4-7cf2-4bb2-97be-c3234b813230"). InnerVolumeSpecName "kube-api-access-72d7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:07 crc kubenswrapper[5033]: I0319 20:30:07.586593 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72d7h\" (UniqueName: \"kubernetes.io/projected/c114afb4-7cf2-4bb2-97be-c3234b813230-kube-api-access-72d7h\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:08 crc kubenswrapper[5033]: I0319 20:30:08.421589 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-v9bgp" Mar 19 20:30:08 crc kubenswrapper[5033]: I0319 20:30:08.485408 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-x7glf"] Mar 19 20:30:08 crc kubenswrapper[5033]: I0319 20:30:08.494201 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-x7glf"] Mar 19 20:30:08 crc kubenswrapper[5033]: I0319 20:30:08.631864 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27f4e2c-782c-4b13-8c36-d637b5f9f55c" path="/var/lib/kubelet/pods/c27f4e2c-782c-4b13-8c36-d637b5f9f55c/volumes" Mar 19 20:30:18 crc kubenswrapper[5033]: I0319 20:30:18.620676 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:30:18 crc kubenswrapper[5033]: E0319 20:30:18.621418 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:30:29 crc kubenswrapper[5033]: I0319 20:30:29.621648 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:30:29 crc kubenswrapper[5033]: E0319 20:30:29.622436 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-779xw_openshift-machine-config-operator(c960a9d1-3c99-4e77-9906-e319e0aed817)\"" pod="openshift-machine-config-operator/machine-config-daemon-779xw" 
podUID="c960a9d1-3c99-4e77-9906-e319e0aed817" Mar 19 20:30:41 crc kubenswrapper[5033]: I0319 20:30:41.621401 5033 scope.go:117] "RemoveContainer" containerID="726c99b00898629e942efe3c57e6a81120e82061fb2d8f310ddb6fe8c5df7646" Mar 19 20:30:42 crc kubenswrapper[5033]: I0319 20:30:42.758646 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-779xw" event={"ID":"c960a9d1-3c99-4e77-9906-e319e0aed817","Type":"ContainerStarted","Data":"f1c4b86d2f0a005757f975b262c1e29133e56243859979ad2b03c4c5fafeea24"} Mar 19 20:30:42 crc kubenswrapper[5033]: I0319 20:30:42.984084 5033 scope.go:117] "RemoveContainer" containerID="59f8423300628e3af197a9bcb4e7b30069e04ca5b7b330029b455075319acbcc" Mar 19 20:30:43 crc kubenswrapper[5033]: I0319 20:30:43.016042 5033 scope.go:117] "RemoveContainer" containerID="90804c370e7cf0012f8cd253d460268e8cbb78afd0e7e3a1b65df583ecc71a04"